How Formal Verification Can Thwart Change-Induced Network Outages


Formal verification is not a new concept. In a nutshell, the process uses sophisticated math to prove or disprove whether a system achieves its desired functional specifications. It is employed by organizations that build products that absolutely cannot fail. One of the reasons NASA rovers are still roaming Mars years after their intended lifespan is that the correctness of their software was mathematically verified before deployment. Similarly trusted 24/7/365 technology is embedded in mission-critical airplane flight controls, medical devices, and military defense systems that are too important to malfunction.

Recently, a team of computer science professors and Ph.D. students in the EnterpriseWorks incubator at the University of Illinois at Urbana-Champaign (UIUC) discovered how that same methodology can be applied to bulletproof today’s most complex networks, helping prevent change-induced outages and breaches. Mathematical network verification is long overdue. Suppose only 2% of modifications to a network’s configuration result in a change-induced outage or vulnerability. To put that in perspective: would you board an airplane if you knew that two out of every 100 flights could fall out of the sky? Of course not. Why should we expect less from our networks? With so much sensitive data at stake, networks have to be as trustworthy as mission-critical systems and infrastructure.

Why Networks Fail


Four key factors have made network infrastructure particularly vulnerable to breaches and outages. First, when you factor in the cloud, virtualization, the move to software-defined networks (SDN), and mobile and IoT devices, you’ll quickly see that networks have become incredibly complex to manage. Second, every network change leaves an opening for something to possibly be misconfigured. By some estimates, operators of large enterprise or service provider networks make approximately 1,000 changes per month. Third, humans are a constant and unpredictable factor. Gartner analyst Donna Scott has noted that “80 percent of unplanned downtime is caused by people and process issues, including poor change management practices, while the remainder is caused by technology failures and disasters.”

And finally, there is poor policy management. According to the consulting firm Protiviti, one out of three enterprises and service providers lacks policies for IT, information security and data encryption, while 71 percent lack critical knowledge of which policies to institute to mitigate vulnerabilities and disruption. The fact is, most enterprises do not know what their network actually looks like at the deep infrastructure level, whether it is operating as it should be, or what vulnerabilities are lurking within it. As a result, making any change to the network, even a day-to-day modification such as changing access-control rules or adding a device, is a time-consuming, manual and risky process. And broad architectural changes, such as moving to a hybrid cloud or deploying SDN, can be daunting projects.

How Formal Verification Works

Formal verification tries to predict the future: will my design work when I deploy it in the field? Will it always work, no matter what unexpected inputs or attacks are thrown at it? For example, a software application designer might want to know that her code will never crash or that it will never execute certain functions without an authorized login. These are simple, practical questions – but answering them is computationally challenging because of the enormous number of possible ways code may be executed, depending on the inputs and environments it interacts with. Decades of research advances in formal verification algorithms have led to the ability to rigorously answer such questions for certain kinds of software and critical systems.
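To make the idea concrete, here is a minimal sketch of one basic verification technique, bounded model checking: exhaustively exploring every possible input sequence of a system up to some depth to see whether a safety property can be violated. The session model, its actions, and the property are all illustrative assumptions, not taken from any real verifier.

```python
from itertools import product

# Illustrative toy system: a session in which a privileged action should
# never run unless a login has succeeded. State is a pair
# (logged_in, violated): whether the user is logged in, and whether the
# safety property has ever been broken.
def step(state, action):
    """Transition function for the toy session model."""
    logged_in, violated = state
    if action == "login":
        return (True, violated)
    if action == "logout":
        return (False, violated)
    if action == "execute":
        # Record a violation if 'execute' runs while logged out.
        return (logged_in, violated or not logged_in)
    return state

def violates_property(max_depth=6):
    """Bounded model checking: try every action sequence up to
    max_depth and return a counterexample sequence that violates the
    property, or None if none exists within the bound."""
    actions = ["login", "logout", "execute"]
    for depth in range(1, max_depth + 1):
        for seq in product(actions, repeat=depth):
            state = (False, False)  # start logged out, no violation yet
            for action in seq:
                state = step(state, action)
            if state[1]:
                return seq  # a concrete trace that breaks the policy
    return None

print(violates_property())  # → ('execute',)
```

Because the model allows "execute" without a login guard, the checker immediately finds the one-step counterexample. Real verifiers answer the same kind of question for vastly larger state spaces by using symbolic techniques rather than brute-force enumeration.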

Can we understand the behavior of complex enterprise networks with the same mathematical rigor? In a network, we want to understand whether the design meets network-wide data-flow policy: Is my network correctly segmented to protect against lateral movement by attackers inside my perimeter? Will my web services be available even after a router interface failure? Today, these questions are addressed through manual spot-checks, auditing and change-review boards that might take weeks to provide answers, all of which still leave plenty of room for error. In contrast, mathematical network verification reasons automatically about the possible behaviors of the network.

Achieving that analysis required several innovations:

  • First, it required rigorous understanding of devices’ data-flow behavior, down to the data plane instructions, which are the “machine code” of the network.
  • Second, it required sophisticated, novel reasoning algorithms and compact data structures that can efficiently explore the exponentially large number of possible packets that may be injected at thousands or tens of thousands of devices and ports in the network.
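The second point can be sketched in miniature: instead of testing individual packets, a verifier pushes entire sets of packet headers through each device’s forwarding rules and tracks which subsets can reach which devices. The toy topology, rule tables, and 4-bit address space below are illustrative assumptions, not the data structures of any real verification tool.

```python
from collections import deque

# Toy network model: each device has an ordered rule table of
# (set of matching destination addresses, next hop); first match wins,
# like a forwarding table. Addresses are 4-bit for brevity.
ADDRS = set(range(16))
rules = {
    "edge":     [({8, 9, 10, 11}, "core"), (ADDRS, "drop")],
    "core":     [({8, 9}, "dmz"), ({10, 11}, "internal")],
    "dmz":      [],
    "internal": [],
}

def reachable_headers(src, dst):
    """Symbolically inject the full set of possible destination
    addresses at src and return the subset that can ever reach dst."""
    seen = set()
    arrived = set()
    queue = deque([(src, frozenset(ADDRS))])
    while queue:
        device, headers = queue.popleft()
        if (device, headers) in seen or not headers:
            continue
        seen.add((device, headers))
        if device == dst:
            arrived |= headers
            continue
        remaining = set(headers)
        for match, nxt in rules.get(device, []):
            hit = remaining & match
            remaining -= match  # later rules only see unmatched headers
            if hit and nxt != "drop":
                queue.append((nxt, frozenset(hit)))
        # addresses left in `remaining` match no rule and are dropped
    return arrived

# Segmentation check: which destination addresses can travel from the
# edge into the internal segment?
print(sorted(reachable_headers("edge", "internal")))  # → [10, 11]
```

One traversal answers the question for all 16 addresses at once; production tools use compact set representations to do the same over the full packet-header space across tens of thousands of ports.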

Using mathematical network verification can help enterprises prevent the outages and breaches that lead to astronomical losses, both informational and financial. Unlike techniques such as penetration testing and traffic analysis, mathematical network verification performs exhaustive analysis of all possible data-flow behavior in the network, before it happens – before vulnerabilities can be exploited, and without waiting for users to experience outages. If there is a network policy violation, verification will find it, pinpoint the vulnerability, and show how to fix the flaw. The underlying technology allows for millisecond-level analysis of security policies, enabling real-time alerting and policy enforcement. It can also provide mathematical evidence that the network is correct, giving organizations the confidence to implement changes to their infrastructure.

Real-time Situational Awareness

Depending on how an organization applies mathematical network verification to its network, the technology can maintain real-time situational awareness of the network’s data-plane state (the lowest, most foundational layer of information in a network device), build a flow model of the entire network, and rigorously analyze security policies in real time to detect vulnerabilities and policy violations.

In a world where a network intrusion can result in billion-dollar losses, brand damage and outages of critical functionality, mathematical network verification represents a significant step towards improving overall network health and preventing network outages and breaches. It is the only technology available today capable of providing rigorous, real-time analysis at the deep data-plane level of a complex network, and it produces a level of confidence not attainable by other approaches.

Mathematical network verification is a unique technology that has already demonstrated success in multiple Fortune 500 and government networks. We will likely see quick adoption of the technology as organizations discover how quickly and dramatically it improves network security and resilience, making change-induced breaches and outages a thing of the past.

By Dr. Brighten Godfrey
