Answers To Gartner’s Six Cloud Computing Risks
Cloud computing has been the subject of ever-increasing hype, and anything exposed to that much publicity attracts criticism, constructive or otherwise. Gartner, an information technology research and advisory firm, has published a report identifying several crucial risks in the cloud computing industry.
Below are the six risks highlighted in the Gartner report, each followed by a response.
1: Privileged User Access: A risk concerning who manages an organization's data in the cloud. Interestingly, most cloud datacenters operate with very few people on site; the process is largely automated, with software controlling other software and data. By contrast, an organization may well have untrustworthy or unreliable employees at its on-premises datacenter. The very fact that automated processes look after an organization's data arguably makes the cloud more secure than data held in the organization's own hands.
2: Regulatory Compliance: A risk concerning the certifications and regulations that apply to a cloud service. Here, the argument is that it is in the cloud provider's own interest to obtain as many certifications as it can. Since Gartner's report on cloud computing risks was published back in 2008, prominent cloud providers have indeed acquired certifications for their services and datacenters.
3: Data Location: Organizations worry about what happens if their data ends up somewhere beyond their control. Taking a step back, picture an individual walking out of the office with a laptop holding critical data; the chances of that laptop being snatched are far from negligible. In that sense, the location risk can be greater when data is not stored in the cloud at all. A sensible response to the data location concern is to use multiple cloud services and store different portions of data in different clouds, thereby reducing the exposure of any single location.
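The multi-cloud idea above can be sketched in a few lines. This is a minimal illustration, not a real storage client: the provider names are hypothetical placeholders, and the "upload" is just an in-memory dictionary, assuming the data is split so that no single provider ever holds the complete dataset.

```python
# Hypothetical sketch: split critical data into chunks and place each
# chunk with a different cloud provider, so no one provider holds it all.

PROVIDERS = ["cloud_a", "cloud_b", "cloud_c"]  # placeholder provider names

def split_across_clouds(data: bytes, providers=PROVIDERS) -> dict:
    """Assign one contiguous chunk of the data to each provider."""
    chunk_size = max(1, -(-len(data) // len(providers)))  # ceiling division
    placement = {p: b"" for p in providers}
    for i in range(0, len(data), chunk_size):
        provider = providers[i // chunk_size]
        placement[provider] = data[i:i + chunk_size]
    return placement

def reassemble(placement: dict, providers=PROVIDERS) -> bytes:
    """Recover the original data by concatenating chunks in provider order."""
    return b"".join(placement[p] for p in providers)

placement = split_across_clouds(b"critical business records")
print(reassemble(placement))  # b'critical business records'
```

A real deployment would also encrypt each chunk before upload, but the core point stands: a thief would need to compromise every provider to reconstruct the data.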
4: Data Segregation: The concern that one's data should never mix with someone else's. Once again, the answer lies in automation: today's cloud services rely on highly automated systems that reduce the chances of data loss or segregation failures to nearly zero.
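One common automated approach to segregation is to force every storage operation through a tenant-scoped namespace. The sketch below is an assumption about how such a layer might look, with hypothetical tenant names; real multi-tenant platforms enforce this at many layers, not just one.

```python
# Hypothetical sketch: a shared store whose API automatically namespaces
# every record by tenant, so one tenant can never address another's data.

class SegregatedStore:
    def __init__(self):
        self._data = {}

    def put(self, tenant: str, key: str, value) -> None:
        # The (tenant, key) composite key is applied automatically;
        # callers cannot write outside their own namespace.
        self._data[(tenant, key)] = value

    def get(self, tenant: str, key: str):
        # Raises KeyError for any key outside the caller's tenant.
        return self._data[(tenant, key)]

store = SegregatedStore()
store.put("acme", "invoice", "INV-001")
store.put("globex", "invoice", "INV-777")
print(store.get("acme", "invoice"))  # INV-001, never globex's record
```

Because the namespacing happens inside the store rather than in each caller, there is no code path through which tenants' records can mix.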
5: Data Recovery: The concern that consumers might not be able to get their data back. In practice, if data is mission-critical, an organization will keep two or even three backups. More importantly, an organization cannot blame a cloud service for a logical failure: if the organization deletes its own files, it cannot hold the cloud responsible for the loss.
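The double-or-triple backup practice described above can be sketched as follows. This is a minimal illustration, assuming in-memory dictionaries stand in for a cloud bucket and two local disks; restore succeeds as long as any one copy survives with a matching checksum.

```python
# Hypothetical sketch: write the same payload to several independent
# targets and restore from the first copy whose checksum still matches.
import hashlib

def backup(payload: bytes, targets: list) -> str:
    """Write the payload to every target; return its SHA-256 digest."""
    digest = hashlib.sha256(payload).hexdigest()
    for target in targets:
        target[digest] = payload
    return digest

def restore(digest: str, targets: list) -> bytes:
    """Return the first surviving copy that still matches the digest."""
    for target in targets:
        copy = target.get(digest)
        if copy is not None and hashlib.sha256(copy).hexdigest() == digest:
            return copy
    raise LookupError("no intact copy found in any target")

cloud_bucket, onsite_disk, offsite_disk = {}, {}, {}
key = backup(b"quarterly ledger", [cloud_bucket, onsite_disk, offsite_disk])
del cloud_bucket[key]  # the organization deletes its own cloud copy...
print(restore(key, [cloud_bucket, onsite_disk, offsite_disk]))  # ...but the data survives
```

Note that the deletion in the example is the organization's own doing, which is exactly the article's point: the cloud is not to blame for a logical failure on the customer's side.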
6: Long-term Viability: The concern that the cloud provider may not remain in business indefinitely. There are two aspects to this. The first, as mentioned above, is that an organization should keep its mission-critical data backed up with other cloud services or in-house datacenters. The second concerns business continuity: a cloud provider can typically sustain a higher level of continuity than a business can on its own, particularly a small-scale business. And looking at today's cloud giants, none of them appears likely to run into difficulties anytime soon.
Going even further and looking from a different angle, cloud providers may eventually grow so large that it would not be in governments' interest to let them fail, much as with today's banks.
By Haris Smith
July 5, 2012