The Cloud: Focusing On Cloud Performance – Part 2

Continued from Part 1

Companies are usually advised to conduct a workload analysis exercise before deciding to move a process to the cloud and choosing a provider. For vendors it is, again, critical to understand, in both business and computational terms (number of records, size distributions, CPU and memory consumption), the loads of the companies they wish to serve. While this may sound obvious, the outcome of a study published by the IEEE explains why it needs to be articulated. The study, published in November 2010 as "Performance Analysis of Cloud Computing Services for Many-Tasks Scientific Computing," found that the four major commercial cloud computing services were an order of magnitude lower in performance than required to be useful to the scientific community. It is understandable that the cloud was not designed specifically for scientific computing in the first place; yet nothing prevented cloud services from being touted as viable alternatives to grids and clusters for that community. This study, while seemingly unconnected to industry, still offers a relevant message to potential cloud service vendors: the cloud should be designed to cater to the workload. Perhaps, for related reasons, it would be good to have clouds for specific tasks or industries.

Another observation of the study that should not be lost on us is that virtualization and resource time-sharing add significant performance overheads. Such overheads should be thoroughly assessed at the providing data center. Current practices do not achieve more workload processing with less hardware; on the contrary, due to the overheads of virtualization, more hardware is required than would be needed to do the job with hardware installed at individual companies. Savings, however, can result from pooling hardware resources that would otherwise sit idle except during each company's peak loads.
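The economics described above can be sketched with a back-of-the-envelope calculation. A minimal illustration, using entirely hypothetical figures: each company sized for its own peak needs the sum of all peaks, while a shared cloud only needs the combined peak (which is lower, since peaks rarely coincide) plus the virtualization overhead.

```python
# Hypothetical figures: a rough sketch of why pooling peak loads can save
# hardware even after paying a virtualization overhead. All numbers are
# illustrative assumptions, not measurements.

def pooled_capacity_needed(peak_loads, combined_peak, overhead=0.15):
    """Capacity (in server units) for individual installs vs. a shared cloud.

    peak_loads    -- each company's own peak demand (server units)
    combined_peak -- observed peak of the summed demand; peaks rarely
                     coincide, so this is below sum(peak_loads)
    overhead      -- fractional virtualization/time-sharing overhead
    """
    individual = sum(peak_loads)             # each site sized for its own peak
    shared = combined_peak * (1 + overhead)  # cloud sized for the joint peak, plus overhead
    return individual, shared

# Three companies each peaking at 100 units, but their peaks do not overlap:
# the combined load never exceeds 180 units.
individual, shared = pooled_capacity_needed([100, 100, 100], combined_peak=180)
print(individual, round(shared))  # 300 vs 207: savings despite the overhead
```

The point of the sketch is that the savings come from non-coinciding peaks, not from virtualization itself, which only adds cost; if the combined peak approaches the sum of individual peaks, the overhead makes the cloud strictly more expensive in hardware terms.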

The Cisco Global Cloud Index provides forecasts for IP traffic and guidance for vendors and communities that depend on networks. Such forecasts are invaluable for both cloud and network providers. The forecasts appear to be based on current applications being moved to the cloud, even though Cisco does project that by 2015 about 50 percent of all workloads will be handled by cloud data centers. Sizable workloads, such as order fulfillment for a major retailer or an OSS application for a telecom provider, could easily throw such forecasts off when added to the cloud. The study makes a very useful point: as more applications move to the cloud, traffic that would have remained within a data center moves onto the Internet. Redundancy and availability features, which are so often demanded of the cloud, will also cause more traffic to be routed through the Internet, and online transaction-intensive workloads demand substantial network bandwidth. For these reasons and more, it is important for a cloud provider to quantitatively assess the traffic its offering would generate. The provider needs to work with network providers, share those assessments, and ensure that network bandwidth and capacity are available at the required scale before the offering goes live. This might be dismissed as superfluous in light of techniques meant to allow elastic IP capacity, but it remains a necessary exercise to quantify the task at hand in network terms and to evaluate whether network providers can match that size, with or without such techniques.
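The quantitative traffic assessment argued for above can start as simply as multiplying transaction rates by payload sizes. A minimal sketch, with every input a hypothetical assumption, of sizing the sustained bandwidth a transaction-intensive offering would push onto the Internet:

```python
# A back-of-the-envelope sizing sketch for the traffic a cloud offering
# would generate. All inputs are hypothetical assumptions, not measurements.

def required_bandwidth_mbps(tx_per_sec, bytes_per_tx, replication_factor=2.0,
                            headroom=1.5):
    """Estimate sustained bandwidth in Mbit/s.

    tx_per_sec         -- peak transactions per second
    bytes_per_tx       -- average request+response payload per transaction
    replication_factor -- multiplier for redundancy/availability traffic
    headroom           -- safety margin for bursts
    """
    bits_per_sec = tx_per_sec * bytes_per_tx * 8 * replication_factor * headroom
    return bits_per_sec / 1_000_000

# 5,000 tx/s at 20 KB each, replicated twice, with 50% burst headroom:
print(round(required_bandwidth_mbps(5000, 20_000)))  # 2400 Mbit/s
```

Note how the replication factor makes the point from the paragraph concrete: redundancy and availability features multiply the traffic crossing the Internet, doubling the link capacity the network providers must be able to supply.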

The performance-related promises of the cloud should not slacken subscribing companies' interest in understanding their own performance requirements thoroughly and completely. While the onus of performance has indeed shifted to the vendor, clients still need a firm grip on their requirements. Traditionally, understanding the performance requirements of client companies has been a weak link between clients and vendors; in cloud environments this link could snap altogether. Peak business loads, volumes, and average usage patterns should be known to both parties, as should expectations such as application response times and operational constraints (such as the time windows available for batch runs). Well-understood requirements not only help define meaningful SLAs between client and provider but, when provided early, help the provider design robust offerings.
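Turning a vague expectation like "fast response times" into a meaningful SLA clause usually means agreeing on a measurable figure, such as a percentile latency. A minimal sketch, with hypothetical sample data and a hypothetical 300 ms threshold, of checking measured response times against such a clause:

```python
import math

# A minimal sketch of turning "response time expectations" into a checkable
# SLA figure: the 95th-percentile latency. Data and threshold are hypothetical.

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (ms)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

response_times_ms = [120, 95, 140, 110, 480, 105, 130, 125, 100, 115]
p95 = percentile(response_times_ms, 95)
sla_ms = 300  # hypothetical agreed SLA threshold

print(p95, p95 <= sla_ms)  # 480 False -- one outlier breaches the 300 ms SLA
```

A percentile clause is a deliberate design choice over an average: the mean of the samples above sits comfortably under 300 ms, yet a single 480 ms outlier breaches the 95th percentile, which is exactly the behavior users notice and the kind of detail both client and provider should understand before signing.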

By Suri Chitti

Suri Chitti (suri_chitti@hotmail.com) is a software and technology management professional. He has many years of experience across different industrial domains, working for stakeholders such as companies, vendors, and consultancies, and has contributed to many aspects of software production, including functional analysis, testing, design, and management.

