The Tolly Group Report: How Dimension Data Beat Out Some Big Players To Help Keep Your Data Up To Date


The next time you check out a busy commercial website – one that showcases products, sells them, ships them, and generates buzz and conversation around them – spare a thought for the billions of bits of data running around behind the scenes to make its videos, promos and “buy now” catalogues work smoothly, reliably and securely. Much of the infrastructure behind sites like these comes courtesy of a few organizations that have recognized the need for a more cohesive approach to collecting and redistributing data in the cloud, built on a “network-centric” rather than a “best effort” architecture.

The technological wizardry behind complex websites tends to go unnoticed by the average consumer – at least until something goes wrong, at which point the great “fail whale” emerges to spoil the fun.

The cloud is growing by leaps and bounds, but a great deal of its infrastructure is built on existing components. It can often be a hodgepodge of servers and programs assembled from elements that were never designed to scale to the degree, or with the versatility, now required. “Cloud” may sit at the top of every CIO’s agenda, but, according to Gartner Research, it still forms a relatively small portion of the $3.7 trillion IT industry.

This means we are still in the early days of the cloud as a primary technology. It has a way to go before it emerges as a platform for more than just testing and development, and becomes the place for hosting mission-critical data applications.

Enter the Tolly Group.

The Tolly Group was founded in 1989 to provide hands-on evaluation and certification of IT products and services. In a recent study, conducted in May 2013, Tolly researchers tested the cloud performance of four major providers – Amazon, Rackspace, IBM and Dimension Data – across four areas: CPU, RAM, storage and network performance. Their findings exposed the price and performance limitations of today’s “commodity” or “best effort” clouds, which rely on traditional, server-centric architectures. The report found that of these four big players, the network-centric approach used by Dimension Data’s enterprise-class cloud helped lower cost and risk and accelerate the migration of mission-critical apps to the cloud.


Keao Caindec, CMO of the Cloud Solutions Business for Dimension Data, was obviously pleased with the results of Tolly’s stringent testing, but not surprised. The report, he says, tells an interesting story: not all clouds are created equal, and there are big differences between providers. This, he believes, will force end users to look more critically at the underlying performance of any provider they choose to do business with.

As an example, Caindec points out that when someone buys a router, switch or server, those devices come with specs. But such specs don’t exist broadly in the cloud world. In many cases, he says, clouds were developed as low-cost compute platforms – a best effort. That is no longer enough. A provider must demonstrate a great deal more reliability in terms of speed, security and scalability – for example, by designing an application to scale either up or out. When scaling up, a provider must be able to add more power to the cloud server. When scaling out, it must be able to easily add more instances. He cautions that clients in a growth phase must be careful about scaling up, since such expansions may not deliver the desired increase in performance.
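To make the scale-up/scale-out distinction concrete, here is a minimal Python sketch of the two capacity-planning approaches. The throughput-per-vCPU figure and instance sizes are invented for illustration, and real scale-up gains are rarely this linear – which is exactly Caindec’s caution.

```python
# A minimal sketch contrasting scale-up and scale-out capacity planning.
# All figures (requests per vCPU, instance sizes) are hypothetical.

REQS_PER_VCPU = 500  # assumed throughput per vCPU, per second

def scale_up(target_rps: int) -> dict:
    """Meet demand with one bigger server: add vCPUs to a single instance."""
    vcpus = -(-target_rps // REQS_PER_VCPU)  # ceiling division
    return {"instances": 1, "vcpus_per_instance": vcpus}

def scale_out(target_rps: int, vcpus_per_instance: int = 4) -> dict:
    """Meet demand with more servers: add identical instances behind a balancer."""
    per_instance_rps = vcpus_per_instance * REQS_PER_VCPU
    instances = -(-target_rps // per_instance_rps)
    return {"instances": instances, "vcpus_per_instance": vcpus_per_instance}

if __name__ == "__main__":
    demand = 10_000  # target requests per second
    print("scale up: ", scale_up(demand))   # 1 instance, 20 vCPUs
    print("scale out:", scale_out(demand))  # 5 instances, 4 vCPUs each
```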

Caindec points to some specific types of work that Dimension Data does with its high-profile clients: “We help them with their websites by leveraging the public cloud for testing and development. This allows granular configuration of the server, which means that each server is configured with as much storage/power as is needed.” He points out that customers often need to make sure they are not buying too much of one resource. For example, a database app may need a lot of memory – perhaps 32 GB on a server – but not necessarily a lot of computing power. Dimension Data, he says, takes care to help clients configure the exact amount of resources necessary, allowing them to save money by not over-provisioning.
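The economics of that kind of right-sizing are easy to sketch. The Python snippet below compares a hypothetical fixed CPU-and-RAM bundle against a granular configuration for the memory-heavy database example above. All prices and bundle shapes are made up for illustration and do not reflect any provider’s actual rates.

```python
# A minimal sketch of "granular" right-sizing versus fixed instance bundles.
# Prices and bundle shapes are hypothetical, for illustration only.

FIXED_BUNDLES = {  # typical commodity-cloud shapes: CPU and RAM scale together
    "large":  {"vcpus": 8,  "ram_gb": 16, "price_hr": 0.40},
    "xlarge": {"vcpus": 16, "ram_gb": 32, "price_hr": 0.80},
}

def granular_price(vcpus: int, ram_gb: int,
                   cpu_rate: float = 0.03, ram_rate: float = 0.01) -> float:
    """Price a server configured with exactly the resources requested."""
    return vcpus * cpu_rate + ram_gb * ram_rate

# A memory-heavy database app: needs 32 GB of RAM but only a few vCPUs.
need_vcpus, need_ram = 4, 32

bundle = FIXED_BUNDLES["xlarge"]  # smallest bundle with 32 GB also forces 16 vCPUs
print(f"fixed bundle:  ${bundle['price_hr']:.2f}/hr for {bundle['vcpus']} vCPUs")
print(f"granular spec: ${granular_price(need_vcpus, need_ram):.2f}/hr "
      f"for exactly {need_vcpus} vCPUs / {need_ram} GB")
```

Under these invented rates, the granular configuration comes in at roughly half the hourly price of the bundle, because the customer is not paying for twelve idle vCPUs.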

Caindec finds the Tolly study eye-opening primarily because it raises the question: are low-cost clouds really low cost? “If the model is more best effort and because of that you have to run more servers, are you being as economical as you could?” For the most part, he points out, the costs of cloud providers are similar, but performance levels vary much more dramatically. In other words, “You may not be saving all the money you could. You may find a lower cost per hour, but in a larger environment, especially when running thousands of servers, this does not become economic.”
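A back-of-the-envelope calculation shows how a lower hourly rate can still lose at fleet scale. The server counts and rates below are hypothetical, not figures from the Tolly report:

```python
# A back-of-the-envelope sketch of Caindec's point: a lower hourly rate can
# still cost more at fleet scale if weaker performance forces extra servers.
# All rates and server counts below are hypothetical.

HOURS_PER_MONTH = 730

def monthly_cost(servers: int, rate_per_hr: float) -> float:
    return servers * rate_per_hr * HOURS_PER_MONTH

# "Best effort" cloud: cheaper per hour, but needs more servers for the load.
best_effort = monthly_cost(servers=1300, rate_per_hr=0.10)

# Higher-performing cloud: pricier per hour, fewer servers for the same load.
higher_perf = monthly_cost(servers=1000, rate_per_hr=0.12)

print(f"best effort: ${best_effort:,.0f}/month")  # $94,900
print(f"higher perf: ${higher_perf:,.0f}/month")  # $87,600
```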

Caindec notes that at this point in IT history there is still a great deal that is not well understood, and not many statistics to go on. He hopes that IT managers and CTOs everywhere will draw more granular insights from the full Tolly report – insights such as the fact that more memory does not automatically mean applications will run better or deliver better throughput. “If you scale up the size of the server, the server runs faster, but requires higher throughput to reach other servers.” He says companies must be careful to benchmark their own applications. Hiring a high-profile testing firm like Tolly is not necessary for this; testing tools are publicly available, and he strongly advises making such testing and awareness standard practice.
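Benchmarking your own workload does not have to be elaborate. As a starting point, here is a minimal sketch using only the Python standard library; the workload function is a hypothetical stand-in, and you would substitute a representative slice of your actual application:

```python
# A minimal sketch of benchmarking your own workload before choosing a cloud
# instance, using only the Python standard library.

import time

def workload() -> None:
    """Hypothetical CPU-bound stand-in for one unit of application work."""
    sum(i * i for i in range(100_000))

def throughput(seconds: float = 5.0) -> float:
    """Run the workload repeatedly and report operations per second."""
    done, start = 0, time.perf_counter()
    while time.perf_counter() - start < seconds:
        workload()
        done += 1
    return done / (time.perf_counter() - start)

if __name__ == "__main__":
    print(f"{throughput():.1f} ops/sec on this instance")
```

Running the same script on candidate instance types gives a like-for-like comparison that hourly price alone cannot.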

By Steve Prentice

About Steve Prentice

Steve Prentice is a project manager, writer, speaker and expert on productivity in the workplace, specifically the juncture where people and technology intersect. He is a senior writer for CloudTweaks.


