The Great Debate: Is The Cloud Ready for Mission Critical Applications?
We all know cloud is the future of IT, but there are still a few unanswered questions. For instance, with all the high-profile cloud outages, is the cloud really ready for mission critical applications? This common question arises because inevitably, we wonder if the cloud is more or less reliable and secure than our own data centers.
In a recent survey on cloud computing trends from North Bridge Venture Partners, 50 percent of respondents said they were confident cloud solutions were viable for mission critical business applications. Only three percent of those surveyed believed cloud was too risky, down from 14 percent the year prior.
It’s clear many companies are transitioning to the cloud, and as more make the move, more applications are being transitioned as well. While this shift to the cloud is becoming more prevalent, decision makers should take caution and not just throw everything into one pot. The cloud is not one-size-fits-all: some applications are a better fit for a private cloud, others are better suited for the public cloud, and, quite frankly, some applications should not move to the cloud at all.
Beyond the private-versus-public cloud consideration is the question of which kind of cloud service would be best.
Today, cloud services are typically one of three environments:
- SaaS (Software as a Service) can replace an application you currently run in-house
- PaaS (Platform as a Service) can provide the application deployment platform your application requires
- IaaS (Infrastructure as a Service) provides only the underlying servers, storage and network services needed
These cloud services are truly only the beginning. Going forward, you can expect the cloud to become a highly specialized place. Specialization is a natural development of any new market, and cloud will be no different. A few of these specialized clouds have already taken off, including Desktop as a Service (DaaS) and Disaster Recovery as a Service (DRaaS). Eventually, we’ll reach the point of custom clouds, whereby organizations can leverage APIs to craft custom business processes that span many specialized clouds.
While that vision is still some way off, today success in the cloud depends on characterizing applications or workloads based on different factors and correctly determining the fit to a particular deployment model. When it comes to mission critical applications, correctly assessing these factors is even more important, since the business depends on their availability.
Some of the factors to consider include business concerns such as cost, efficiency, security and flexibility. Technology factors include performance, scalability, manageability and the operating environments required. Mission critical applications also require the highest levels of availability and disaster recovery, so those are key considerations.
Additionally, the majority of enterprise applications aren’t monolithic but consist of different tiers, such as a presentation layer, business logic and a database/file system. If there are well-defined interfaces between these tiers, some may be suited for the cloud and others may not.
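To make the tier-by-tier assessment concrete, here is a minimal, purely illustrative sketch of how such a rubric might be expressed in code. The tier attributes, function names and decision rules are assumptions for illustration, not a method prescribed by the article:

```python
# Illustrative sketch: mapping application tiers to deployment targets
# based on the kinds of factors discussed above (virtualizability,
# criticality, performance sensitivity). All names are hypothetical.
from dataclasses import dataclass


@dataclass
class Tier:
    name: str
    virtualizable: bool        # can it run in a virtual machine?
    mission_critical: bool     # does the business depend on its availability?
    performance_sensitive: bool


def recommend_target(tier: Tier) -> str:
    """Return a deployment recommendation for a single application tier."""
    if not tier.virtualizable:
        # Non-virtualized workloads stay on bare metal in a private cloud.
        return "private cloud (bare metal)"
    if tier.mission_critical or tier.performance_sensitive:
        return "private cloud"
    return "public cloud"


app_tiers = [
    Tier("presentation", virtualizable=True, mission_critical=False,
         performance_sensitive=False),
    Tier("business logic", virtualizable=True, mission_critical=True,
         performance_sensitive=False),
    Tier("database", virtualizable=False, mission_critical=True,
         performance_sensitive=True),
]

for t in app_tiers:
    print(f"{t.name}: {recommend_target(t)}")
```

A real assessment would of course weigh many more factors (cost, security, manageability, disaster recovery), but the point is the same: each tier can be evaluated and placed independently when the interfaces between tiers are well defined.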
Another important consideration is whether the application can run in a virtualized server environment. Databases or applications that are performance-intensive may not run well in a virtual machine. As for bare metal, today’s public clouds simply can’t handle applications that are not virtualized. There are, however, technologies that can provision full physical servers while still retaining the flexibility and automation that cloud services provide, giving you the capability to deploy mission critical applications in a private cloud.
This leads to a hybrid cloud strategy: public clouds for testing/development and non-mission critical applications, and private clouds for mission critical or non-virtualized applications. Undoubtedly, as the range of services offered by public providers continues to expand, the lines will blur and current technology impediments will be overcome. So if you haven’t already taken the plunge, now is the time.
By Pete Manca, CEO of Egenera
Pete Manca brings over 25 years’ experience in enterprise computing and business development, with expertise in a wide range of mission-critical enterprise data center technologies. Previously, as CTO and EVP of Engineering, Manca led product planning by working directly with customers to understand their most difficult challenges and guide them toward solutions.