5 Reasons Why The Cloud Is Still Not 100% Secure
In the last year, many big cloud companies have come under cyber attack, leading to outages and data loss
Cloud companies offer tremendous advantages to businesses: flexibility of scale, better security, and reduced manpower and maintenance costs. The majority of organizations and individuals are now convinced of the value of the cloud and are starting to migrate their data to it. It is generally assumed that one's data is more secure in the cloud than it would be on an insecure desktop or in a server room. Unfortunately, this assumption does not always hold true.
If there is one thing that Google, Apple, Microsoft, Adobe, Spamhaus, American Express, Evernote, Facebook and Twitter have in common, it is vulnerability to cyber attacks. All of these organizations use cloud solutions for their business, and all have been victims of cyber attacks over the past year. Some of them have publicly acknowledged that data breaches took place during such attacks.
Many cloud subscribers today wonder why these high-profile tech companies are unable to keep their data safe and secure. Here are five reasons why this may be the case:
1) Dynamic nature of business and inherent complexity
For an end user, the services offered by cloud companies are structured in a very simple manner. But the structure inside the cloud is inherently complex. Multiple customers share physical databases, file servers, web servers and disk space. It is usually logical separation, implemented in software, and only rarely physical separation that keeps them apart.
Moreover, organizations' business requirements, and thus their cloud needs, keep changing. As a result, regular restructuring of security controls becomes essential. It is a daunting task to keep the security controls for such a dynamic and complex environment up to date so that they cannot be exploited. Any slip-up opens the door for hackers.
2) Cloud companies cannot own 100% of the responsibility for security
Organizations often overlook their own responsibilities when they offload a business domain to the cloud. While the service provider will do its best, it cannot ensure absolute safety at the subscriber's end. Organizations themselves have to ensure that their own systems are patched, that access to the cloud is limited to authorized users, that there are no stale accounts in the user list, and that encryption keys are kept safe.
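A simple example of this subscriber-side housekeeping is auditing for stale accounts. The sketch below is purely illustrative: the usernames, timestamps and the 90-day threshold are assumptions, and a real audit would read records exported from the cloud provider's admin console rather than a hard-coded list.

```python
from datetime import datetime, timedelta

# Hypothetical user records, as an admin console export might provide them:
# (username, last_login). All names and dates here are illustrative.
users = [
    ("alice", datetime(2013, 7, 1)),
    ("bob", datetime(2012, 11, 15)),   # former employee, never deprovisioned
    ("carol", datetime(2013, 6, 20)),
]

def find_stale_users(users, now, max_idle_days=90):
    """Return usernames whose last login is older than the idle threshold."""
    cutoff = now - timedelta(days=max_idle_days)
    return [name for name, last_login in users if last_login < cutoff]

stale = find_stale_users(users, now=datetime(2013, 7, 15))
print(stale)  # ['bob']
```

Accounts flagged this way can then be disabled or reviewed, closing one of the doors the cloud provider itself cannot close for you.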
3) Increased sophistication of cyber attacks
In recent years, the hacker community has become better organized and better funded; in certain cases, organizations and governments back them. The change is evident in the speed at which zero-day vulnerabilities are weaponized, the size of payloads, and the comprehensive functionality available in malware. The DDoS attack on Spamhaus in March generated 300 gigabits per second of traffic, something unheard of before. It is not easy to completely ward off such sophisticated and powerful attacks.
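The Spamhaus attack used DNS amplification, where small spoofed queries elicit much larger responses that open resolvers send to the victim. The back-of-the-envelope arithmetic below uses illustrative packet sizes, not measured values from the actual attack, to show why the technique is so effective.

```python
# DNS amplification arithmetic (illustrative numbers, not measured values).
# The attacker sends a small query with the victim's spoofed source address;
# an open resolver then sends a much larger response to the victim.
query_bytes = 64          # typical small DNS query
response_bytes = 3000     # large EDNS0 "ANY" response aimed at the victim

amplification = response_bytes / query_bytes
print(f"amplification factor: ~{amplification:.0f}x")

# Query traffic the attacker must generate to produce a 300 Gbps flood:
target_gbps = 300
attacker_gbps = target_gbps / amplification
print(f"attacker needs only ~{attacker_gbps:.1f} Gbps of query traffic")
```

With roughly 47x amplification, a few gigabits per second of spoofed queries is enough to produce a flood of hundreds of gigabits per second, which is why such attacks are hard to ward off at the target.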
Notable cyber attacks in 2013
| Date | Target | Details |
|------|--------|---------|
| Feb 2013 | Twitter | Twitter announces in a blog post that it has detected unusual access attempts against the accounts of 250,000 users. As a consequence, the affected users' accounts are reset. |
| Feb 2013 | Facebook | Hit by targeted attacks; admits to having been compromised by a watering hole attack in January. |
| Feb 2013 | Apple | Apple admits to having been hit by the same sophisticated cyber attack that targeted Facebook. The culprit is iPhoneDevSDK, a forum compromised to serve malware exploiting a zero-day vulnerability. |
| Feb 2013 | Microsoft | With a scant statement on its Security Response Center blog, Microsoft admits to having been targeted by the same cyber attack that hit Facebook and Apple. |
| Feb 2013 | American Express | In the name of #OpBlackSummer, TunisianCyberArmy1 (AKA @TN_cyberarmy) claims to have hacked American Express and stolen 2 GB of data. |
| Mar 2013 | Spamhaus | Spamhaus is the victim of a massive DDoS attack carried out with DNS amplification, reaching a peak of 300 Gbps. |
| Apr 2013 | WordPress | Security analysts from at least three web hosting services detect an ongoing attack using more than 90,000 IP addresses to brute-force administrative credentials of vulnerable WordPress installations. |
| Apr 2013 | Google | The Bangladeshi hacker TiGER-M@TE defaces the Kenyan domain of Google (google.co.ke). |
| May 2013 | Drupal | Passwords for almost one million accounts on drupal.org are reset after hackers gain unauthorized access to sensitive user data by exploiting a vulnerability in an undisclosed third-party application. |
| Jul 2013 | Apple | Extended outage of the developer portal (developer.apple.com) due to an intruder. Apple does not rule out the possibility that some developers' names and addresses may have been accessed. |
4) Ascertaining jurisdiction is difficult in a virtual environment
Virtualization is among the founding principles of cloud computing. For a subscriber, it is not easy, and at times impossible, to find out where exactly their data is stored. The location may be a different data centre in a different city, state or country altogether. Unless jurisdiction is ascertained, it is difficult to invoke the law, and precious time gets wasted. In case of a breach, it becomes difficult to seek legal help and go after the culprits. This situation works to the advantage of the hacker community, and the perpetrators often remain at large.
5) Vulnerable users are everywhere
No amount of security is enough if there are vulnerable users in the system. Despite all the training and awareness programs, people make mistakes and thus expose the whole system to security risks. The use of easy or predictable passwords, the sharing of accounts, and falling prey to phishing or vishing attacks all continue to happen. In the end, hackers need just one small door into the fortified castle.
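One small, concrete mitigation for the weak-password problem is rejecting short or commonly used passwords at account creation. This is a minimal sketch: the tiny blocklist below is an illustrative sample, and a real deployment would check against a large list of leaked passwords.

```python
# Minimal weak-password check. The blocklist is a tiny illustrative sample;
# real systems screen against large leaked-password datasets.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein", "admin"}

def is_weak(password: str) -> bool:
    """Flag passwords that are short or on the common-password blocklist."""
    return len(password) < 10 or password.lower() in COMMON_PASSWORDS

print(is_weak("letmein"))        # True: on the blocklist
print(is_weak("Tr0ub4dor&3x!"))  # False: long and not a common password
```

Checks like this cannot stop a determined attacker, but they close the most predictable of the small doors mentioned above.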
Cloud solutions are here to stay, but so are the cyber attacks on them. Organizations and individuals must weigh the pros and cons of cloud solutions before embracing them.
By Manoj Tiwari
(Image Source: Shutterstock)