
How To Be Data Compliant When Using The Cloud


Companies using the cloud for data storage, application hosting or anything else have to carefully consider data compliance. Governance, risk management and compliance professionals, as well as information security managers, need a clear understanding of (and must stay up to date on) industry-specific regulation as well as all rules relevant to the country or countries they operate in. Proving end-to-end compliance requires data transparency, both inside and outside the company. To stay on track, a system that provides a framework for stringent data governance and risk management is a must-have.

Out of sight, not mind

A company’s data is always its own responsibility, regardless of whether it is stored on servers in its own data center or in the cloud. The cloud is a business tool; contracting the services of a cloud provider doesn’t change how compliance rules apply, nor does it shift responsibility onto the provider. The same is true when transferring data and holding it in a new location.


The first step towards data compliance is to understand the types of data held and the rules and regulations that apply to them. Considerations around the cloud are secondary to this essential first step. Types of data with particular considerations include patient data, personally identifiable information (PII), and customer and employee data. However, such a list can never be fully exhaustive.

To be compliant, a data model should:

  • assign a classification to each type of data to reflect its type and sensitivity. Is it Restricted? Confidential? Private? Public domain? This sets the baseline for how that information should be treated
  • address regulatory demands, which can be many and complex. In truth, understanding regulatory requirements can also help with data classification.
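A classification scheme like the one above can be expressed directly in code. The sketch below is a minimal, hypothetical model: the category names and the policy mapping are illustrative, and a real policy would be derived from the applicable regulations rather than hard-coded.

```python
from enum import IntEnum

class Classification(IntEnum):
    # Ordered from least to most sensitive, matching the levels in the text.
    PUBLIC = 0
    PRIVATE = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Hypothetical mapping of data categories to classification levels.
POLICY = {
    "press_release": Classification.PUBLIC,
    "employee_record": Classification.PRIVATE,
    "customer_pii": Classification.CONFIDENTIAL,
    "patient_record": Classification.RESTRICTED,  # e.g. HIPAA-covered data
}

def classify(data_type: str) -> Classification:
    """Return the classification for a data type, defaulting to the
    most sensitive level when the type is unknown (failing closed)."""
    return POLICY.get(data_type, Classification.RESTRICTED)
```

Failing closed on unknown data types is a deliberate choice here: it is safer to over-protect unclassified data than to treat it as public by default.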

Knowing the rules

Industries of all types have data compliance considerations and all regulations relevant to the particular business have to be taken into account. In financial services, regulation from the U.S. Securities and Exchange Commission, SEC rule 17a-4, outlines stringent requirements around data retention and accessibility for companies trading or brokering financial securities. Companies holding or transferring medical information have to comply with controls required by the Health Insurance Portability and Accountability Act (HIPAA).

Then there are region-specific regulations such as the General Data Protection Regulation (GDPR), which comes into effect soon in Europe and requires companies to report data breaches within just 72 hours. It also tightens controls around data retention, and hefty fines are set to be imposed for violations.

Understanding the many regulations around data capture, sharing and use is challenging enough, but for most companies the harder task is staying compliant over time. Regulations are updated all the time, so companies need the flexibility to stay current and to keep systems in place to manage their compliance activities. This relates not only to how companies manage data, but also to how and when they report on their data handling.

Visibility is essential for compliance

Reporting can be viewed as a burdensome activity that takes place at the end of an operation, but in truth the demands of reporting can provide a useful framework for establishing transparent, visible operations that will stand up to external scrutiny.

Supply chain management, with its complexity of supplier relationships and continually moving parts, is an excellent case in point. Data is created all the time in the workings of the supply chain as new suppliers come on-stream, previous suppliers leave, product components are sourced, approved and supplied, and goods are manufactured and shipped.

Companies without total data transparency are unable to confirm they comply with a range of rules created to safeguard consumers, businesses and workers and to execute national and international policies. These cover such things as the inclusion of only safe ingredients in pharmaceuticals and the exclusion of conflict minerals from manufacturing. Companies without complete data visibility will struggle to identify where the components or ingredients in their products came from.

The path to automation

On the whole, companies tend to perform due diligence when evaluating a cloud solution and provider. At the start of the working relationship they have a complete understanding of privacy issues, data location and data controls, but unfortunately they often don’t put much time into establishing the measures and processes needed to maintain this rigor. Vendors need to provide automated checks or, at the very least, a framework for continually updating the client on the measures and controls that are being met.

Any system holding data classified as confidential or above should provide verification – preferably automatic – of logins and access rights. It isn’t sufficient to learn at the start of the relationship that this is done; it needs to be demonstrated on a regular basis. This proactive demonstration is the first step on the path towards automated risk management and compliance.
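The recurring verification described above can be automated as a simple reconciliation: compare the access rights actually granted against the approved policy and surface anything unapproved. This is a minimal sketch with illustrative names; a real review would pull grants from the cloud provider's identity APIs.

```python
def audit_access(grants: dict, approved: dict) -> list:
    """Compare actual access grants against the approved policy and
    return (user, extra_rights) pairs for any grants lacking approval.
    Input shapes are assumptions for this sketch: grants maps a user to
    a list of rights, approved maps a user to a set of allowed rights."""
    violations = []
    for user, rights in grants.items():
        allowed = approved.get(user, set())
        extra = set(rights) - allowed  # rights held but never approved
        if extra:
            violations.append((user, sorted(extra)))
    return violations

# Example review run with sample data:
grants = {"alice": ["read", "write"], "bob": ["read", "admin"]}
approved = {"alice": {"read", "write"}, "bob": {"read"}}
findings = audit_access(grants, approved)
```

Run on a schedule, a check like this turns a one-off due-diligence finding into the ongoing, demonstrable verification the text calls for.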

Compliance at each stage

How and where data is stored has downstream implications. If classified incorrectly, data that should be strictly internal could end up with an outside supplier. As data travels, there are opportunities to use the full range of capabilities technology provides to secure and protect data and comply with demands around how it should be handled.

This starts at a basic level, such as making use of categories and flags in email packages. Applications in the cloud can handle tagged information in a prescribed way to control the flow of information according to the markers set, so the first stage of classification is particularly important. Beyond this sits encryption and encryption key management, whereby control resides with the data’s owner. When the client has the key – not the cloud vendor – they can revoke it at the first sign of compromise, rendering the data inaccessible. By storing the key separately from the data held in the cloud, the risk of it becoming compromised in the event of a data breach is minimized.
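The key-revocation pattern described above can be sketched as a client-side key store held apart from the cloud data. Everything here is illustrative: the XOR "cipher" is a placeholder to keep the sketch self-contained, not real cryptography – a production system would use AES-GCM or a managed key management service. The point demonstrated is the control flow: once the owner revokes the key, the ciphertext becomes unrecoverable.

```python
import secrets

class KeyStore:
    """Client-side key store, held separately from the data in the cloud.
    Revoking a key makes anything encrypted under it unreadable."""

    def __init__(self):
        self._keys = {}

    def create_key(self) -> str:
        key_id = secrets.token_hex(8)
        self._keys[key_id] = secrets.token_bytes(32)
        return key_id

    def revoke(self, key_id: str) -> None:
        # Destroying the key renders all ciphertexts under it inaccessible.
        self._keys.pop(key_id, None)

    def _keystream(self, key: bytes, n: int) -> bytes:
        # Placeholder keystream: repeat the key to cover n bytes.
        return (key * (n // len(key) + 1))[:n]

    def encrypt(self, key_id: str, plaintext: bytes) -> bytes:
        key = self._keys[key_id]
        return bytes(a ^ b for a, b in
                     zip(plaintext, self._keystream(key, len(plaintext))))

    def decrypt(self, key_id: str, ciphertext: bytes) -> bytes:
        if key_id not in self._keys:
            raise KeyError("key revoked: data is unrecoverable")
        # XOR is symmetric, so decryption reuses the encrypt path.
        return self.encrypt(key_id, ciphertext)
```

Because the cloud provider never holds the key, a breach of the stored ciphertext alone does not expose the data.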

Then, beyond encryption comes scanning technology. This identifies data of a particular type through pattern matching and can flag any causes for concern, such as social security numbers within data without sufficient privacy classification. This is an additional failsafe, and a sophisticated level of automation that is becoming baked into cloud services.
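At its simplest, this kind of scanning is pattern matching over stored data. The sketch below flags U.S. Social Security numbers in their common NNN-NN-NNNN form when they appear in data whose classification is below a privacy threshold; the function name and classification labels are assumptions for this sketch, and production scanners use far richer rule sets with validation and context checks.

```python
import re

# Simple pattern for SSNs in the common NNN-NN-NNNN form.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_for_ssn(text: str, classification: str) -> list:
    """Return SSN-like strings found in data whose classification is
    below 'confidential' -- the failsafe check described in the text.
    Sufficiently classified data passes without flags."""
    if classification in ("confidential", "restricted"):
        return []
    return SSN_PATTERN.findall(text)
```

A hit from a scan like this signals a classification gap: sensitive identifiers sitting in data that the earlier tagging stage marked as less than confidential.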

As more data moves to the cloud, businesses need to know it is protected and that it is collected, stored and shared in a compliant way. In the past, IT was involved in the set-up and updating of business systems. Now, Information Security needs to be completely onboard throughout to preserve the integrity of the company’s operations and information. Data is at the heart of each business and as it now so often resides outside the four walls of the company, there is an added responsibility on businesses to live by data protection and compliance principles.

By Vidyadhar Phalke, Chief Technology Officer, MetricStream

Vidya Phalke

Vidya Phalke is responsible for MetricStream's technical architecture and strategy. Prior to being promoted to the CTO position, Vidya served as Vice President of Product Management and Engineering where he was responsible for MetricStream's Software Products and Platform Delivery. Starting with MetricStream in 2003, Vidya has been instrumental in developing an industry-leading GRC software platform. Before joining the software industry, Vidya earned a PhD in Computer Science from Rutgers University, where he won two Small Business Innovation Research grants for his research on databases and network optimization.