November 20, 2023

Q&A Daniel Barber – 2024 AI + Data Privacy Predictions

By Randy Ferguson


In a recent interview with CloudTweaks, Daniel Barber, Co-Founder and CEO of DataGrail, shared insightful perspectives on the evolving landscape of AI and privacy. Barber emphasizes the importance of cautious optimism regarding AI, noting the technology’s potential as an innovation accelerator while also acknowledging the challenges in claiming complete control over it. He also highlights the need for robust discovery and monitoring systems, and governance to ensure responsible AI usage.

AI and Control Dilemmas – Given the current state of AI development, where do you stand on the debate about controlling AI? How do you balance the need for innovation with the potential risks of unintended consequences in AI deployment?

Anyone promising complete control of AI shouldn’t be trusted. It’s much too soon to claim “control” over AI; there are too many unknowns. But just because we can’t control it yet doesn’t mean we shouldn’t use it. Organizations first need to build ethical guardrails around AI, essentially adopting an ethical use policy. These parameters must be broadly socialized and discussed within their companies so that everyone is on the same page. From there, people need to commit to discovering and monitoring AI use over the long term. This is not a set-it-and-forget-it situation. AI is evolving too rapidly, so it will require ongoing awareness, engagement, and education. With precautions in place that account for data privacy, AI can be used to innovate in some pretty amazing ways.

AI as a Privacy Advocate – You have pointed to the potential of AI as a tool for enhancing privacy, such as predicting privacy breaches or redacting sensitive information in real time. Can you provide more insight into how organizations can harness AI as an ally in privacy protection while ensuring that the AI itself doesn’t become a privacy risk?

As with most technology, there is risk, but mindful innovation that puts privacy at the center of development can mitigate such risk. We’re seeing new use cases for AI daily, and one such case could include training specific AI systems to work with us, not against us, as their primary function. This would enable AI to meaningfully evolve. We can expect to see many new technologies created to address security and data privacy concerns in the coming months.

Impact of 2024 Privacy Laws – With the expected clarity in privacy laws by 2024, particularly with the full enforcement of California’s privacy law, how do you foresee these changes impacting businesses? What steps should companies be taking now to prepare for these regulatory changes?

Today, 12 states have enacted “comprehensive” privacy laws, and many others have tightened regulation over specific sectors. Expect further state laws, and perhaps even a federal privacy law, in the coming years. But the legislative process is slow: a law must be passed, given time to take effect, and then enforced. So regulation will not be some immediate cure-all. In the interim, it will be public perception of how companies handle people’s data that drives change.

The California law is a good guideline, however. Because California has been at the forefront of addressing data privacy concerns, its law is the most informed and advanced at this point. California has also had some success with enforcement. Other states’ legislation largely drafts off of California’s example, with minor adjustments and allowances. If companies’ data privacy practices fall in line with California law, as well as GDPR, they should be in relatively good shape.

To prepare for future legislation, companies can enact emerging best practices, develop and refine their ethical use policies and frameworks (yet make them flexible enough to adapt to change), and engage with the larger tech community to establish norms.

More specifically, if they don’t already have a partner in data privacy, they should get one. They also need to perform an audit of all the tools and third-party SaaS applications that hold personal data. From there, organizations need to conduct a data-mapping exercise: they must gain a comprehensive understanding of where data resides so that they can fulfill consumer data privacy requests as well as their promise to be privacy compliant.
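To make that data-mapping exercise a little more concrete, here is a minimal sketch in Python of the kind of inventory it produces. The system names, owners, and data categories are entirely hypothetical, and this is an illustration of the idea rather than a description of any vendor’s platform: the point is simply that once every system holding personal data is cataloged, routing a consumer access or deletion request becomes a lookup.

from dataclasses import dataclass, field

# Hypothetical inventory entry: one record per system that stores personal data.
@dataclass
class SystemRecord:
    name: str                                   # e.g. a CRM, billing tool, or support desk
    owner: str                                  # team accountable for the data in this system
    data_categories: set = field(default_factory=set)  # kinds of personal data held

# A toy data map built during the audit of tools and third-party SaaS.
DATA_MAP = [
    SystemRecord("crm", "sales-ops", {"email", "name", "phone"}),
    SystemRecord("billing", "finance", {"name", "billing_address", "card_token"}),
    SystemRecord("support_desk", "customer-support", {"email", "ticket_history"}),
]

def systems_for_request(requested_categories):
    """Return every system that must be queried (or purged) to fulfill a
    consumer privacy request covering the given data categories."""
    return [s for s in DATA_MAP if s.data_categories & set(requested_categories)]

if __name__ == "__main__":
    # Example: a deletion request covering email and name touches all three systems above.
    for system in systems_for_request({"email", "name"}):
        print(f"Route request to {system.name} (owner: {system.owner})")

In practice this inventory lives in a privacy management tool rather than a script, but the exercise is the same: enumerate the systems, record who owns them and what personal data they hold, and keep the map current so requests can be fulfilled completely.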

The Role of CISOs in Navigating AI and Privacy Risks – Considering the increasing risks associated with Generative AI and privacy, what are your thoughts on the evolving role and challenges faced by CISOs? How should companies support their CISOs in managing these risks, and what can be done to distribute the responsibility for data integrity more evenly across different departments?

It comes down to two primary components: culture and communication. The road to a better place starts with a change in culture. Data security and data privacy must become the responsibility of every individual, not just CISOs. At the corporate level, this means every employee is accountable for preserving data integrity.

What might this look like?

Organizations might develop data accountability programs, identifying the CISO as the primary decision maker. This step would ensure the CISO is equipped with the necessary resources, both human and technological, while upleveling processes. Many progressive companies are forming cross-functional risk councils that include legal, compliance, security, and privacy, which is a fantastic way to foster communication and understanding. In these sessions, teams surface and rank the highest-priority risks and figure out how to communicate them most effectively to executives and boards.

Comprehensive Accountability in Data Integrity – You have stressed the importance of comprehensive accountability and of empowering all employees to be guardians of data integrity. Could you elaborate on the strategies and frameworks that organizations can implement to foster a culture of shared responsibility in data protection and compliance, especially in the context of new AI technologies?

I’ve touched on some of these above, but it starts with building a culture in which every individual understands why data privacy is important and how it fits into their job function, whether that’s a marketer determining what information to collect, why, how long to keep it, and under what conditions, or a customer support agent who collects information while engaging with customers. And of course, privacy must be central to the design of all new products; it can’t be an afterthought.

It also means carefully considering how AI will be used throughout the organization, and to what end, and establishing ethical frameworks to safeguard data. And it may mean adopting privacy management or privacy-preserving technologies to make sure all bases are covered, so that you can be a privacy champion that uses data strategically and respectfully to further your business and protect consumers. Those interests are not mutually exclusive.

Randy Ferguson

Randy boasts 30 years in the tech industry, having penned articles for multiple esteemed online tech publications. Alongside a prolific writing career, Randy has also provided valuable consultancy services, leveraging a deep knowledge of technological trends and insights.
