
Addressing Security, Quality and Governance of Big Data


Article sponsored by SAS Software and Big Data Forum

Big Data is quickly being recognized as a valuable influencer of business strategy, able to improve productivity, streamline business processes, and reduce costs. However, not all data holds the same value and organizations need to take care to address the quality of the data they’re exploiting, while carefully managing security and governance concerns.

Data Quality

Blindly trusting business reports to be based on sound, quality information can lead not only to embarrassment but also to business decline should the foundational data be found lacking. For this reason, ensuring the data your organization employs in its analytics and reporting is both relevant and of high quality is of the utmost importance. While using only high-quality data is a sound principle, it is more easily said than done. Understanding where data originated, who has control of and accountability for it, and precisely what your organization’s data quality standards should be are significant tasks that must be undertaken. Moreover, while software exists to help with data correction and error analysis, such tools address only part of the problem. To meet the challenge of ensuring high-quality data, businesses need to implement a plan that identifies quality issues, chases down their causes, and creates resolution strategies.
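
As a rough illustration of that first step, the minimal sketch below, assuming Python with pandas and purely illustrative column names (customer_id, email), profiles a dataset for common quality issues such as missing values, duplicate records, and malformed fields. It is not a SAS tool or a prescribed method, just one way to make quality problems visible before deciding how to resolve them.

```python
# A minimal data profiling sketch; column names and rules are illustrative only.
import pandas as pd

def profile_quality_issues(df: pd.DataFrame) -> dict:
    """Return a simple report of common data quality problems."""
    report = {
        # Share of missing values in each column.
        "missing_ratio": df.isna().mean().to_dict(),
        # Exact duplicate rows that would inflate counts in reporting.
        "duplicate_rows": int(df.duplicated().sum()),
    }
    # Example domain rule: email addresses should contain an "@".
    # (Missing emails are converted to the string "None" and flagged too.)
    if "email" in df.columns:
        bad_email = ~df["email"].astype(str).str.contains("@", na=False)
        report["invalid_email_rows"] = int(bad_email.sum())
    return report

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, "not-an-email", "d@example.com"],
    })
    print(profile_quality_issues(sample))
```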


Carol Newcomb, Senior Data Management Consultant at SAS, suggests a sustainable plan for managing data quality, warning that the process is unlikely to be simple: it involves many steps, such as implementing rules for collecting or creating data, data standardization and summarization rules, rules for integrating data with other sources, and hierarchy management.
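
The sketch below gives a feel for what such rules might look like in practice. It assumes Python with pandas, and the field names, state-code mapping, and join key are hypothetical; it is not Newcomb’s or SAS’s implementation, only an illustration of standardization and integration rules applied before data is used for reporting.

```python
# Hypothetical standardization and integration rules for customer data.
import pandas as pd

STATE_CODES = {"california": "CA", "calif.": "CA", "new york": "NY"}

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Standardization rule: map free-text state names to two-letter codes,
    # leaving values that cannot be mapped unchanged rather than dropping them.
    cleaned = out["state"].str.strip().str.lower().map(STATE_CODES)
    out["state"] = cleaned.fillna(out["state"])
    # Standardization rule: coerce date strings into proper datetimes;
    # values that fail to parse become NaT so they can be flagged for review.
    out["signup_date"] = pd.to_datetime(out["signup_date"], errors="coerce")
    return out

def integrate(customers: pd.DataFrame, orders: pd.DataFrame) -> pd.DataFrame:
    # Integration rule: join to the other source only on the agreed shared key.
    return customers.merge(orders, on="customer_id", how="left")

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2],
        "state": ["California ", "calif."],
        "signup_date": ["2016-01-15", "15/01/2016"],
    })
    orders = pd.DataFrame({"customer_id": [1], "order_total": [42.0]})
    print(integrate(standardize(customers), orders))
```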

Newcomb asserts that an effective, sustainable data quality plan should include the following five elements:

  • Elevate the visibility and importance of data quality.
  • Formalize decision making through a data governance program.
  • Document the data quality issues, business rules, standards and policies for data correction.
  • Clarify accountability for data quality.
  • Applaud your successes.

Considering Gartner analysts’ prediction that, by 2017, one-third of the world’s largest companies will experience an information crisis due to an inability to adequately value, govern, and trust their enterprise information, it’s crucial that businesses put data quality programs in place.

Data Regulations, Security & Privacy

The ITRC’s latest Data Breach Report points to 500 data breaches in the first half of this year, with more than 12 million records compromised. Such breaches expose Social Security numbers, medical records, financial information, and more, putting individuals at risk. It’s no wonder privacy is such a concern as the mountains of data grow exponentially day by day. Though the ability to track customers and predict future behaviors is of great benefit to many companies, it’s a trade-off for most consumers, whose privacy is eclipsed by the convenience and assistance that data analysis and prediction provide. And although organizations such as Google promise data anonymization, it’s impossible to know how carefully our privacy is guarded, while the increasing popularity of the Internet of Things means our engagement with the internet and data collection is only escalating.


(Image Source: Shutterstock)

This extensive data does offer many genuine benefits, not least of all patient monitoring for superior and prompt medical care, and real-time data use in classrooms to ensure education systems function efficiently. Perhaps, then, data governance and regulation can help mitigate the risks without abandoning the rewards. As Scott Koegler of Big Data Forum states, “Pulling actionable but protected data is key to developing advanced business practices while protecting customer identity.” A ‘Zero Trust Data’ policy is one potential solution, wherein access to data is denied by default and always requires verification. In this way, data is made available to those with a legitimate request, but the information is properly de-identified.
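
As a hedged illustration of what such a ‘Zero Trust Data’ gate could look like, the Python sketch below denies access by default, requires a verification check to pass, and releases only pseudonymized records. The field names, verification callback, and hashing-based masking are assumptions made for the example, not a prescribed implementation of the policy.

```python
# A minimal "deny by default, verify, then de-identify" gate; all names are hypothetical.
import hashlib
from typing import Callable

SENSITIVE_FIELDS = {"ssn", "medical_record_id"}

def de_identify(record: dict) -> dict:
    """Replace direct identifiers with truncated one-way hashes before release."""
    return {
        k: hashlib.sha256(str(v).encode()).hexdigest()[:12] if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

def fetch_record(record: dict, requester: str,
                 is_verified: Callable[[str], bool]) -> dict:
    # Deny by default: no data leaves unless the requester passes verification.
    if not is_verified(requester):
        raise PermissionError(f"Access denied for {requester}: verification required")
    # Even verified requests receive de-identified data by default.
    return de_identify(record)

if __name__ == "__main__":
    def allow_analysts(user: str) -> bool:
        return user.endswith("@analytics.example.com")

    record = {"ssn": "123-45-6789", "diagnosis": "flu"}
    print(fetch_record(record, "jane@analytics.example.com", allow_analysts))
```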

By designing efficient data management policies and properly adhering to regulations, companies are able to reap the benefits of Big Data without infringing on the privacy and security of individuals’ data. Furthermore, applying quality control systems to data collection and management enhances the value businesses draw from their data without needlessly breaching confidentiality. Ultimately, the time spent on data security, quality, and governance benefits the businesses that use the data as well as the individuals and communities to whom it relates.

By Jennifer Klostermann

About Jennifer Klostermann

Jennifer Klostermann is an experienced writer with a Bachelor of Arts degree majoring in writing and performance arts. She has studied further in both the design and mechanical engineering fields, and worked in a variety of areas including market research, business and IT management, and engineering. An avid technophile, Jen is intrigued by all the latest innovations and trending advances, and is happiest immersed in technology.