The Tolly Group Report: How Dimension Data Beat Out Some Big Players

(Update Revision: Initial Post, August 30th)

The next time you check out a busy commercial website – one, for example, that talks about products, sells them, ships them and generates buzz and conversation about them – spare a thought for the billions of bits of data running around behind the scenes to make the site’s videos, promos and “buy now” catalogues work smoothly, reliably and securely. Much of the infrastructure behind sites like these comes to you courtesy of a few organizations that have recognized the need for a more cohesive approach to collecting and redistributing data in the cloud, through a “network-centric,” rather than “best effort,” architecture.

The technological wizardry behind complex websites tends to go unnoticed by the average consumer; at least until something goes wrong, at which point the great “fail whale” emerges to spoil the fun.

The cloud is growing by leaps and bounds, but a great deal of its infrastructure is built on existing components. It can often be a hodgepodge of servers and programs assembled from elements that were never designed to scale to the degree, or with the versatility, now required. “Cloud” may sit at the top of every CIO’s agenda, but, according to Gartner Research, it still forms a relatively small portion of the $3.7 trillion IT industry.

This means we are still in the early days of the cloud as a primary technology. It has a way to go to emerge as a platform for more than just testing and development, and to become the place for hosting mission-critical data and applications.

Enter the Tolly Group.

The Tolly Group was founded in 1989 to provide hands-on evaluation and certification of IT products and services. In a recent study, conducted in May 2013, Tolly researchers tested the cloud performance of four major providers – Amazon, Rackspace, IBM and Dimension Data – across four areas: CPU, RAM, storage and network performance. Their findings exposed the price and performance limitations of today’s “commodity” or “best effort” clouds that rely on traditional, server-centric architectures. The report found that of these four big players, the network-centric approach used by Dimension Data’s enterprise-class cloud helped lower cost and risk, and accelerate migration of mission-critical apps to the cloud.

Keao Caindec, CMO of Dimension Data’s Cloud Solutions Business, was obviously pleased with the results of Tolly’s stringent testing, but not surprised. He points out that the report tells an interesting story: not all clouds are created equal, and there is a big difference between providers. This, he believes, will force end-users to look more critically at the underlying performance of any provider they choose to do business with.

As an example, Caindec points out that when someone buys a router, switch or server, the hardware comes with specs. Such specs don’t exist broadly in the cloud world. In many cases, he says, clouds were developed as low-cost compute platforms – a best effort. Now, however, this is not enough. A provider must demonstrate a great deal more reliability in terms of speed, security and scalability – for example, designing an application to scale either up or out. When scaling up, a provider must be able to add more power to the cloud server; when scaling out, it must be able to easily add more instances. He points out that clients in a growth phase must be careful about scaling up, since such expansions may not deliver the desired level of increased performance.
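Caindec’s scale-up/scale-out distinction can be sketched in a few lines of code. This is a minimal, hypothetical capacity model – the class, functions and numbers are purely illustrative, not anything from Dimension Data or the Tolly report:

```python
# Hypothetical model of the two growth paths Caindec describes.
# All names and numbers here are illustrative only.

class CloudServer:
    def __init__(self, vcpus=2, ram_gb=8):
        self.vcpus = vcpus
        self.ram_gb = ram_gb

def scale_up(server, extra_vcpus, extra_ram_gb):
    """Scaling up: add more power to an existing cloud server."""
    server.vcpus += extra_vcpus
    server.ram_gb += extra_ram_gb
    return server

def scale_out(fleet, count, **specs):
    """Scaling out: add more instances to the fleet."""
    fleet.extend(CloudServer(**specs) for _ in range(count))
    return fleet

fleet = [CloudServer()]
scale_up(fleet[0], extra_vcpus=2, extra_ram_gb=8)  # one bigger server
scale_out(fleet, count=3)                          # three more default servers
print(len(fleet), fleet[0].vcpus)                  # prints "4 4"
```

The sketch makes the trade-off visible: scaling up changes one machine’s specs, while scaling out changes the shape of the whole fleet – which is why, as Caindec warns, the two paths do not yield the same performance gains.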

Caindec points to some specific types of work that Dimension Data does with its high-profile clients: “We help them with their websites by leveraging the public cloud for testing and development. This allows granular configuration of the server, which means that each server is configured with as much storage/power as is needed.” He points out that customers often need to make sure they are not buying too much of one resource. For example, a database application may need lots of memory – perhaps 32 GB on a server – but not necessarily a lot of computing power. Dimension Data, he says, takes care to help clients configure the exact amount of resources necessary, allowing them to save money by not over-provisioning.
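The over-provisioning point lends itself to back-of-the-envelope arithmetic. Here is a hypothetical sketch with invented prices – these are not Dimension Data’s actual rates, just placeholders to show the shape of the calculation:

```python
# Hypothetical right-sizing arithmetic; all prices are invented.
PRICE_PER_VCPU_HOUR = 0.03     # assumed $ per vCPU per hour
PRICE_PER_GB_RAM_HOUR = 0.005  # assumed $ per GB of RAM per hour

def hourly_cost(vcpus, ram_gb):
    return vcpus * PRICE_PER_VCPU_HOUR + ram_gb * PRICE_PER_GB_RAM_HOUR

# A memory-hungry database app: 32 GB of RAM but modest CPU needs.
right_sized = hourly_cost(vcpus=4, ram_gb=32)   # 0.12 + 0.16 = 0.28/hr
# A fixed "large" instance that bundles CPU the app never uses.
bundled = hourly_cost(vcpus=16, ram_gb=32)      # 0.48 + 0.16 = 0.64/hr

monthly_savings = (bundled - right_sized) * 24 * 30
print(f"${monthly_savings:.2f} saved per server per month")  # $259.20
```

Granular configuration, in other words, lets the client pay for the 32 GB of memory the database needs without also paying for twelve idle vCPUs.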

Caindec finds the Tolly study eye-opening primarily because it raises the question: are low-cost clouds really low cost? “If the model is more best effort and because of that you have to run more servers, are you being as economical as you could be?” For the most part, he points out, cloud providers’ costs are similar, but performance levels vary far more dramatically. In other words, “You may not be saving all the money you could. You may find a lower cost per hour, but in a larger environment, especially when running thousands of servers, this does not remain economical.”
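The same arithmetic shows how a cheaper hourly rate can still lose at fleet scale. Again, the figures below are invented for illustration – they are not drawn from the Tolly report:

```python
# Hypothetical comparison: a cheaper per-hour "best effort" cloud that
# needs more servers can cost more overall. All figures are invented.
def fleet_cost(price_per_hour, servers, hours=24 * 30):
    """Monthly cost of running a fleet at a flat hourly rate."""
    return price_per_hour * servers * hours

# Best-effort cloud: cheaper instances, but lower per-server throughput
# means more servers are needed to carry the same workload.
best_effort = fleet_cost(price_per_hour=0.10, servers=1500)
# Network-centric cloud: pricier per hour, fewer servers needed.
network_centric = fleet_cost(price_per_hour=0.13, servers=1000)

print(best_effort > network_centric)  # prints "True"
```

Under these assumed numbers the "cheaper" cloud runs about $108,000 a month against roughly $93,600 – exactly the point Caindec makes about cost per hour versus cost at scale.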

Caindec points out that at this point in IT history there is still a great deal that is not well understood, and there are not many statistics. He hopes that IT managers and CTOs everywhere will be able to obtain more granular insights from the full Tolly report – insights such as the fact that more memory does not mean applications will run better or deliver better throughput. “If you scale up the size of the server, the server runs faster, but requires higher throughput to reach other servers.” Companies, he says, must be careful to benchmark their own applications. It is not necessary to hire a high-profile testing firm like Tolly to do this; testing tools are publicly available, and he strongly advises making such testing and awareness standard practice.
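For readers who want to follow that advice with publicly available tools, here is a minimal benchmarking sketch using Python’s standard timeit module. The workload function is a hypothetical stand-in – in practice you would substitute a hot path from your own application:

```python
# Minimal application benchmark using the standard timeit module.
import timeit

def workload():
    # Hypothetical stand-in for an application hot path;
    # replace with a representative piece of your own code.
    return sum(i * i for i in range(10_000))

# Run the workload repeatedly and keep the best of 5 timings, which
# filters out scheduling noise on a shared cloud server.
runs = timeit.repeat(workload, number=100, repeat=5)
print(f"best of 5: {min(runs):.4f}s for 100 runs")
```

Repeating the same measurement on candidate providers (or on scaled-up versus scaled-out configurations) gives the kind of application-specific baseline Caindec recommends, without hiring a testing firm.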

By Steve Prentice
