A few weeks ago I wrote a myth-busting post to tell the truth about using AI in industrial settings. But why is this coming up? What is different about industrial situations, and why can’t we use the cool AIs there?
The data itself is difficult
Industrial data is lumpy and hard to stitch together. AI models don't perform well when the data come from many disparate types of sources, and the need for a clean data domain in which all the predictors are accessible, normalized, and recognizable practically rules out industrial settings. For example, can you still tell what device or vehicle or machine a given record was extracted from, and what its location was at the time, let alone all the other information relevant to determining its state (weather, temperature, repair history)? The result of assembling and stitching all that data together, along with an ontology that makes sense of it, is what we call the digital thread – and it's critical to creating a solid data foundation.
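To make the stitching problem concrete, here is a minimal sketch of what building one strand of a digital thread can look like. Everything in it is hypothetical – the source formats (`scada_rows`, `mx_rows`), the asset registry, and the unified `Reading` record are invented for illustration – but the pattern is the real work: normalizing records from incompatible systems into one schema, enriched with context from a shared registry.

```python
from dataclasses import dataclass

# Hypothetical unified record: one reading in the "digital thread" view.
@dataclass
class Reading:
    device_id: str
    location: str
    timestamp: str
    temperature_c: float

# Two invented source formats: a SCADA export keyed by tag name (Celsius),
# and a maintenance system that logs temperatures in Fahrenheit.
scada_rows = [{"tag": "PUMP-7/temp", "ts": "2024-05-01T08:00Z", "value": 61.2}]
mx_rows = [{"asset": "PUMP-7", "time": "2024-05-01T09:00Z", "temp_f": 145.4}]

# Asset registry supplying location context -- a tiny piece of the ontology.
registry = {"PUMP-7": "Plant 2, Line B"}

def from_scada(row):
    # Tag names encode the device id before the slash.
    device = row["tag"].split("/")[0]
    return Reading(device, registry[device], row["ts"], row["value"])

def from_mx(row):
    # Normalize Fahrenheit to Celsius so records are comparable.
    device = row["asset"]
    temp_c = round((row["temp_f"] - 32) * 5 / 9, 1)
    return Reading(device, registry[device], row["time"], temp_c)

# The stitched, uniform view across both systems.
thread = [from_scada(r) for r in scada_rows] + [from_mx(r) for r in mx_rows]
```

Even this toy version hints at the pain: every additional source system means another adapter, another set of unit conversions, and another dependency on a registry that is accurate and up to date.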
Access to the data may be blocked
In the years since the first, second, and third industrial revolutions, the techniques, protocols, formats, and storage mechanisms of industrial data have changed dramatically. The systems and equipment sit behind firewalls, in air-gapped buildings, disconnected in combat zones, or in otherwise un-serviced areas. And if physical access is not the problem, then getting security clearance or permission to acquire the data often is.
The predictions, if wrong, can be deadly
AI gets it wrong frequently (e.g., YouTube's fact-checking mislabeled the Notre Dame fire as a possible terror attack). When this happens on YouTube it can be inflammatory, insensitive, and potentially riot-inducing. When AI-aided automation goes wrong in industrial settings, it can be truly dangerous or deadly, as illustrated by the disastrous Boeing 737 MAX situation. Lengthy governance and certification processes, beyond the development of the automation itself, are required. For this reason, our foray into AI for industrials must be measured, and careful…