Why All Those New Google / Amazon Data Centers Won't Really Go To Waste – Cloud Computing's First Supercomputer

As the market leaders of the rapidly growing Cloud Computing industry, both Google and Amazon are steadily increasing the size of their data centers. Critics, however, ask what will happen if or when demand for these data centers falls. Unlike the rest of us, who consume Google and Amazon Cloud services elastically, scaling up and down as our needs change, the providers themselves own the underlying hardware and cannot be nearly so flexible. Should public Cloud demand ever drop, though, Amazon has already found another use for its growing data centers: Cloud Computing's first supercomputer.

By stringing together a cluster of roughly 30,000 processing cores, Amazon's EC2 (Elastic Compute Cloud) earned 42nd place in the TOP500 ranking of the world's supercomputers. Granted, with a score of 240 trillion calculations per second it isn't the fastest, but it is by no means an average performer either. The main point is that it is available to anyone, unlike the usual supercomputer cluster, which is built for a dedicated purpose and therefore offers rather limited access (and even longer waiting lines).
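To put those two figures in perspective, a quick back-of-the-envelope calculation shows what they imply per core. The core count and throughput come from the article itself; everything below is simple arithmetic on them:

```python
# Per-core throughput implied by the article's figures.
TOTAL_CORES = 30_000        # cores in the EC2 cluster (from the article)
TOTAL_FLOPS = 240e12        # 240 trillion calculations per second (from the article)

flops_per_core = TOTAL_FLOPS / TOTAL_CORES
print(f"{flops_per_core / 1e9:.0f} GFLOPS per core")  # prints "8 GFLOPS per core"
```

That works out to about 8 billion calculations per second per core, which is respectable for commodity hardware of the era and underscores that the ranking came from sheer scale rather than exotic processors.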

Amazon proved this by running an actual paid-for supercomputing job at a mere $1,279 an hour. While that may sound like a lot, anyone who has built a dedicated supercomputer cluster will be shaking their head in disbelief (and probably regret!) at the millions of dollars it costs to create one, let alone keep it running. What boggles the mind even further is that Amazon did this while running all of its other Cloud-related services at the same time.
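The economics are easy to sketch. The hourly rate below is the one quoted above; the $5 million build cost is a purely hypothetical figure for a dedicated cluster (the article only says "millions of dollars"), used just to show how long you could rent before renting stops being cheaper:

```python
# Rough break-even between renting and building.
HOURLY_RATE = 1_279          # USD/hour, quoted in the article
DEDICATED_COST = 5_000_000   # USD, hypothetical capital cost (illustrative only)

breakeven_hours = DEDICATED_COST / HOURLY_RATE
breakeven_days = breakeven_hours / 24
print(f"{breakeven_hours:,.0f} hours, about {breakeven_days:.0f} days of non-stop use")
```

Under that assumption, you could run the cloud cluster around the clock for well over five months before matching the up-front cost alone, and that is before counting the dedicated cluster's power, cooling and staffing.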

While there are more supercomputers now than ever before, the need for supercomputing has grown as well. More and more scientists require supercomputer simulations for the best results in DNA sequencing, molecular dynamics and the like. And these days it is not just scientists: as big data continues to grow, so does the need for supercomputers to run global financial risk analyses or render the latest in 3D entertainment.

This means that if a few thousand processing cores sit idle during the lean months, Amazon or Google can simply offer supercomputing deals to everyone currently waiting in line for a turn on a dedicated cluster. Bear in mind that since this was only the first Cloud Computing–based supercomputing run, there are bound to be areas in which it can be optimized even further. With Amazon's EC2 already achieving a very impressive 42nd-place ranking, it isn't too much of a stretch to expect a spot in the top 10 for the next Cloud Computing–based supercomputer.

By Muz Ismail


