What Scientists Want From Their Next Cloud Supercomputing Instance
The Magellan project recently released a report on the viability of cloud computing for scientific work. Like most scientific reports, it contained plenty of Yes, No and Maybe, but the bottom line was that the DOE (US Department of Energy) believes its existing supercomputing centers are better equipped for scientific supercomputing. The report also made clear, in a particularly tactful manner, that the DOE would gladly adopt commercial cloud computing offerings if those offerings provided a little more in the way of customization.
The issue, according to the report, was not that existing DOE supercomputing systems are insufficient to meet current demand, but that commercial cloud offerings such as Amazon's Elastic Compute Cloud (EC2) are simply too generic for scientists to carry out their analyses properly. DOE supercomputer centers have experienced on-site staff who translate scientific requirements into meaningful questions, whereas on EC2 the scientists must frame those questions themselves through Amazon's admittedly generic interface. The hint here is rather obvious for anyone seeking to build the next cloud supercomputing cluster (also a good idea for a startup) to serve the ever-growing needs of the scientific community: make a custom wizard for scientists.
Rather controversially, the report also stated that current DOE supercomputer centers were several times more cost effective than commercial cloud offerings. However, this follows largely from the point above: on commercial clouds, scientists had to spend more time, and needed more outside help, translating their scientific problems into optimized code, which drove up the effective cost.
The rest of the report covers much the same ground, but a key caveat is that the study ran from 2009 to 2011, a period when many cloud offerings had yet to mature. Personally, I feel that current cloud supercomputing offerings are definitely better than those of the last two years, but it seems unlikely that another two-year study will be undertaken any time soon. The DOE stands firm on its findings and recommendations, which will be presented to the Computer Science and Telecommunications Board (CSTB) of the National Academies in February 2012.
The DOE did not deny that it has a long waiting list, tactfully noting instead that its supercomputers have a high utilization rate. It also suggested that cloud computing could be applied as a business model at DOE supercomputing centers, so not everything about cloud computing for the scientific community is a lost cause.
The full Magellan Report on Cloud Computing for Science can be viewed here.
By Muz Ismial