Why ‘Data Hoarding’ Increases Cybersecurity Risk

Data Hoarding

The proliferation of data and the constant growth of content saved on premises, in cloud storage, or in non-integrated solutions pose a challenge to businesses in terms of both compliance and security. This is compounded by the estimate that around 80% of the content most businesses hold is unstructured. A general lack of understanding of how to manage this data, together with a reluctance to delete content, leads to ‘data hoarding’: businesses are often unaware of what data they have and what value it holds, and there is further confusion around its management and governance.

Scattered content, regardless of where it is stored, poses a major security risk. Most businesses are unaware of which data sets are sensitive and are therefore unlikely to have adequate measures in place to secure them against cyberattacks. No security technology implemented to keep data safe can be effective unless it is clear which data is of value. For this reason, document classification must dovetail with security processes to identify, record, and where necessary encrypt content, keeping it safe both inside and outside the firewall.
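
As an illustrative sketch only (not part of the original article), the Python fragment below shows how a classification step might gate encryption: content tagged as sensitive is encrypted before it is stored or shared. The keyword rules, the labels, and the use of the cryptography package’s Fernet API are assumptions standing in for a real classification and data-protection pipeline.

```python
# Minimal sketch: classification-driven encryption for documents at rest.
# The keyword rules and labels are illustrative assumptions, not a real policy.
from dataclasses import dataclass
from cryptography.fernet import Fernet

SENSITIVE_KEYWORDS = {"salary", "passport", "iban", "medical"}  # illustrative rules only

@dataclass
class Document:
    name: str
    body: str
    label: str = "unclassified"

def classify(doc: Document) -> Document:
    """Tag a document as 'sensitive' or 'internal' using simple keyword rules."""
    words = set(doc.body.lower().split())
    doc.label = "sensitive" if words & SENSITIVE_KEYWORDS else "internal"
    return doc

def store(doc: Document, fernet: Fernet) -> bytes:
    """Record the classification and encrypt sensitive content before it is stored."""
    data = doc.body.encode("utf-8")
    return fernet.encrypt(data) if doc.label == "sensitive" else data

fernet = Fernet(Fernet.generate_key())  # key management is out of scope for this sketch
doc = classify(Document("payroll.txt", "employee salary and iban details"))
print(doc.label)                # -> sensitive
print(store(doc, fernet)[:16])  # ciphertext bytes, not plaintext
```

In practice the classification rules would come from a document-management or data loss prevention tool, and key management would sit outside the application; the point is simply that the encryption decision follows from the classification.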

Unless companies embrace the power of open thinking and a new pragmatic approach to security, data breaches are going to become the “new normal”, leaving companies to deal with the inevitable fall-out and impact on brand reputation. Every business is now contending with the interplay between making information instantly accessible to a range of users and keeping it secure against malicious attacks.

Information stewardship is the key to maintaining a hold on burgeoning data in today’s digital age, using business-specific insights to inform decisions and to implement processes that streamline the flow of data and improve the user experience. Content must be correctly identified, categorised and, where appropriate, deleted to keep data proliferation from spiralling out of control. These decisions must be made jointly by senior executives and IT departments: information stewardship cannot be the sole responsibility of either, and instead demands a cohesive approach. A successful breach can have lasting financial and reputational ramifications, so this internal collaboration has to be a priority and a shared responsibility.
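
To make the “identify, categorise and delete where appropriate” step concrete, here is a minimal, hypothetical sketch of a retention rule in Python; the categories and retention periods are illustrative assumptions, not a recommended policy.

```python
# Minimal sketch of a retention rule: categorised content past its retention
# period is flagged for deletion; uncategorised content is flagged for review.
from datetime import date, timedelta
from typing import Optional

RETENTION = {  # assumed retention periods per category; illustrative only
    "financial-record": timedelta(days=7 * 365),
    "marketing-draft": timedelta(days=180),
    "log-file": timedelta(days=90),
}

def disposition(category: str, last_modified: date, today: Optional[date] = None) -> str:
    """Return 'delete' when content has outlived its retention period, 'retain' otherwise."""
    today = today or date.today()
    period = RETENTION.get(category)
    if period is None:
        return "review"  # uncategorised content needs a human decision
    return "delete" if today - last_modified > period else "retain"

print(disposition("log-file", date(2019, 1, 1), today=date(2020, 1, 1)))          # -> delete
print(disposition("financial-record", date(2019, 1, 1), today=date(2020, 1, 1)))  # -> retain
```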

When identifying a solution to the problem of data hoarding, businesses must also keep end users in mind and implement processes that allow them to access the content they need, when they need it, to ensure a smooth workflow. The solution must also support collaboration to facilitate innovation and improve efficiency. Putting users’ requirements at the centre of the process encourages compliance with it, which in turn streamlines and secures data.

To address these challenges, businesses need an enterprise-grade platform rather than a product retrofitted for the purpose: one that provides agility and is continually updated. Open source platforms, with thousands of “white hats” who can help fight the “black hats,” will help companies secure their most precious resource: intellectual property. Transparent, open systems are by their very nature hardened against future breaches. For every new encryption or password-management point solution enterprises put in place, there are likely hundreds of hackers (many backed by state actors) working out how to compromise those countermeasures. Enterprises need more brainpower to stay one step ahead; otherwise they will simply experience these breaches over and over again.

The deluge of data shows no signs of abating. Managing and extracting intelligence from data that keeps growing in silos will remain a challenge: not only will that information decrease in value, it will hamper decision making and carry increasing risk. Embracing open thinking and information stewardship allows content to be identified, classified, curated, and governed throughout its lifecycle, promoting business agility, enabling better decision making, and reducing the risks of data hoarding.

By Ankur Laroia
