Cloud Spectator Report
Cloud Spectator, a cloud benchmarking and consulting agency focused on cloud Infrastructure-as-a-Service (IaaS) performance, has today released the 2017 Top 10 Cloud IaaS Providers Benchmark for North America. Striving for transparency in the cloud market, Cloud Spectator’s report reveals up to a 7.7x difference in value between the highest- and lowest-value providers. The extensive study provides stability and performance results for block storage, CPU, and memory across several virtual machine sizes, and Kenny Li, CEO of Cloud Spectator, explains the distinctive approach to CloudTweaks: “Cloud Spectator’s methodology is unique in that we built our test suite to be iterative to collect not only data on performance level, but also obtain a thorough reading on the stability of each environment. This report is based on a study that collected over 1.5 million data points. Accounts were set up anonymously to eliminate the possibility of bias, and three machines of each type were tested to increase the chance of testing different physical hosts. We try to be as thorough as possible to accurately represent each provider in whichever workloads are tested.”
The Risk of Overspending
Highlighting significant disparities across leading IaaS providers in price, performance, and stability, the report points to the risk that enterprise consumers overspend on cloud products. Remarks Li, “This report was created to provide transparency into a complex marketplace and raise awareness of the performance and cost differences that exist between cloud offerings. Taking advantage of these differences can substantially lower operating costs and increase ROI.” Now more than ever, it’s essential that organizations test cloud products to ensure the services they receive are both necessary and effective, and to confirm appropriate scalability; given the range and value offered by cloud IaaS providers, there’s no excuse for implementing a service that is suboptimal or overpriced.
Covering the cloud service providers 1&1, Amazon Web Services, CenturyLink, Digital Ocean, Dimension Data, Google Compute Engine, Microsoft Azure, OVH, Rackspace, and SoftLayer, the January 2017 test results from Cloud Spectator are separated into two categories: block storage performance and VM performance. Explains Li, “Infrastructure performance is a critical and often overlooked component of a cloud purchase decision. It can have a substantial impact on operating costs and application performance. Neglecting these considerations can lead to budget overruns and operational inefficiencies.”
For price-performance, Cloud Spectator measures the ratio of price to performance, and 1&1 receives the highest score due to strong VM performance and the least expensive package in the study. Some providers with above-average VM performance, however, lagged in this price-performance index due to high costs. For VM performance, both performance variability and median performance were assessed; while most of the organizations analyzed achieved low performance variability and high median performance, the differences across these top ten providers suggest a lack of standardization in public cloud IaaS. Microsoft Azure’s VMs displayed the highest median performance (a performance index score of 92) and the lowest performance variability (2%), and Amazon AWS, Google Compute Engine, and Microsoft Azure exhibited the least performance variability over a 24-hour testing period.
Evaluating block storage performance, the majority of the top ten providers tested fell into the low-performance-variability, low-median-performance category; however, Dimension Data’s disk performance variability surpassed 66% in particular scenarios. Rackspace achieved the highest median disk IOPS performance, with moderate performance variability of 12%.
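The report does not publish its exact formulas, but the two disk metrics it cites can be read as a median over repeated iterations and a dispersion percentage. A minimal sketch, assuming variability is a coefficient of variation and using made-up IOPS samples:

```python
import statistics

def performance_summary(samples: list[float]) -> tuple[float, float]:
    """Return (median performance, variability %) over repeated runs.

    'Variability' here is the coefficient of variation (stdev / mean * 100),
    an assumption on our part -- Cloud Spectator's actual metric definition
    is not given in the article.
    """
    median = statistics.median(samples)
    cv = 100 * statistics.stdev(samples) / statistics.mean(samples)
    return median, cv

# Illustrative (made-up) IOPS readings from six iterations of a disk test:
iops = [4800, 5100, 4950, 5000, 4700, 5200]
median_iops, variability = performance_summary(iops)
```

Running the iterative suite described earlier would produce one such sample list per machine, per test window, letting a stable provider (low variability) be distinguished from one with the same median but erratic performance.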
The overall CloudSpecs Score™ relies on a price-performance calculation that specifies how much performance a user acquires per unit of cost. It should be noted, however, that the IaaS industry lacks a standard methodology for evaluating cloud service providers, which imposes limitations on the study’s methodology. Furthermore, the synthetic testing conducted in Cloud Spectator’s report measured maximum sustainable performance over a 24-hour period, which is not representative of any specific workload. For organizations employing cloud service providers, the report underscores the need to carefully weigh performance requirements, provisioning, and costs before choosing a provider.
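The CloudSpecs Score’s exact formula is proprietary, but “performance per unit of cost” suggests a simple ratio, normalized so the best provider scores 100. A hedged sketch with invented provider names, prices, and index values, showing how a gap on the order of the report’s headline 7.7x could arise:

```python
# Hypothetical price-performance ("value") scoring, NOT Cloud Spectator's
# actual CloudSpecs Score formula. Performance is an abstract index;
# price is an assumed monthly cost in USD. All numbers are illustrative.

def value_score(performance_index: float, monthly_price: float) -> float:
    """Performance obtained per unit of cost."""
    return performance_index / monthly_price

def normalized_scores(providers: dict[str, tuple[float, float]]) -> dict[str, float]:
    """Scale raw value scores so the best provider scores 100."""
    raw = {name: value_score(perf, price) for name, (perf, price) in providers.items()}
    best = max(raw.values())
    return {name: round(100 * score / best, 1) for name, score in raw.items()}

# Made-up figures for three unnamed providers:
providers = {
    "provider_a": (92.0, 40.0),   # strong performance, low price
    "provider_b": (85.0, 120.0),  # similar performance, much higher price
    "provider_c": (70.0, 235.0),  # weaker performance, highest price
}
scores = normalized_scores(providers)
value_gap = max(scores.values()) / min(scores.values())
```

With these invented inputs, provider_a scores 100 while provider_c scores about 13, a value gap in the 7–8x range, which illustrates how two providers with broadly similar capabilities can diverge sharply once price is factored in.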
Find the complete report and detailed findings here.
(Test conducted by Cloud Spectator. Any questions regarding the methodology and accuracy of the report should be directed to Cloud Spectator)
By Jennifer Klostermann