Autonomous cars covered more than 640,000 test miles on Californian roads in 2015, and eleven developers released testing reports. Google’s Waymo alone completed 636,000 miles; GM’s Cruise, Nissan, and Delphi each covered more than 3,000 miles; the others were VW/Audi, Mercedes-Benz, Tesla, Bosch, BMW, Honda, and Ford. All the reports cite similar reasons for drivers taking manual control – an event known in the trade as a ‘disengagement’, or ‘DE’. The accumulated reports were released by the State of California Department of Motor Vehicles.
Streetwise or not?
The summarised reports show that complex road situations on urban streets were the most common reason for test engineers to grab the wheel, including traversing roadworks, changing lanes, and overtaking. Google’s car, the most extensively tested, most often reverted to manual control when about to perform an unexpected and unwanted manoeuvre, or as a remedy for an apparent software discrepancy.
It is laudable that companies like Google prefer to test their autonomous vehicles in complex urban environments, rather than on more straightforward freeways. Some 89 percent of all Google disengagements were in cities. Less laudably, the reasons for disengagement raise concerns about how well autonomous cars’ software and sensors cope with urban conditions.
Autonomous car disengagements

| Manufacturer | Miles covered | Miles per disengagement in 2015 | Common causes |
| --- | --- | --- | --- |
| Waymo (Google) | 635,868 | 1,244.4 | Software discrepancy; unwanted vehicle manoeuvre |
| Mercedes-Benz | 673.4 | 1.8 | Driver discomfort; planned technology tests |
| Delphi | 3,125.3 | 41.9 | Changing lanes in heavy traffic; traffic light detection |
| Tesla Motors | 550 | N/A | Invalid planner or follower output |
| Bosch | 983 | 1.5 | Planned technology tests |
| Nissan | 4,099 | 14 | System failure; imminent collision |
| Cruise (GM) | 9,846.5 | N/A | Remedying unexpected behaviour |
| BMW | 638 | N/A | Unclear lane markings |
| Ford | 590 | N/A | Aborted lane change during high-speed overtaking |
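The table mixes units – some developers report total miles, others miles per disengagement – which makes direct comparison awkward. As a minimal sketch (using only the 2015 report values quoted above), the miles-per-disengagement figures can be inverted into disengagements per 1,000 miles so the rates line up on one scale:

```python
# 2015 miles-per-disengagement figures, as quoted in the table above.
rates = {
    "Waymo (Google)": 1244.4,
    "Mercedes-Benz": 1.8,
    "Delphi": 41.9,
    "Bosch": 1.5,
    "Nissan": 14.0,
}

def per_thousand_miles(miles_per_de: float) -> float:
    """Disengagements per 1,000 miles, given miles per disengagement."""
    return 1000.0 / miles_per_de

for maker, mpd in rates.items():
    print(f"{maker}: {per_thousand_miles(mpd):.1f} DEs per 1,000 miles")
```

On this scale Waymo's 1,244.4 miles per disengagement works out to roughly 0.8 disengagements per 1,000 miles – the same headline figure Google itself reports for 2015.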
Comparing apples and oranges
The Google report reveals that, of 69 reported safe-operation events in 2015, 13 were ‘simulated contacts’ – in plain language, the vehicle would have collided with another object had the driver not taken manual control. The other 56 events could also have led to contact with another object, though less imminently.
The figures show that autonomous cars still require close supervision by a human driver. To be fair, such figures ought to be compared against those for manually driven cars, yet any such research would be untenable: it would compare the proverbial apples with oranges. Human thought processes are intrinsically unsuitable for comparison with the algorithms that operate autonomous cars.
A researcher can collect, assess, and analyse data on both human-driven and autonomous cars, but cannot correctly address factors like emotion, intuition, or human experience accumulated over eons of evolution. An experienced driver can anticipate the behaviour of a car in the adjacent lane fairly quickly. Algorithms are still a long way from being able to do so.
This is reflected in Google’s disengagement figures. Between September 2014 and November 2015, ‘perception discrepancy’ caused 199 disengagements, followed by software discrepancy with 80. Hardware issues aside, there were 55 disengagements because the car began to manoeuvre in unexpected ways, and 23 to accommodate reckless behaviour by other road users.
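To see how dominant perception problems were, the four cause counts quoted above can be turned into percentage shares. A minimal sketch, using only the figures in the text (hardware issues excluded, as the text does):

```python
# Google's disengagement causes, Sept 2014 – Nov 2015, as quoted above.
causes = {
    "perception discrepancy": 199,
    "software discrepancy": 80,
    "unexpected manoeuvre": 55,
    "reckless road users": 23,
}

total = sum(causes.values())  # 357 listed events
for cause, count in causes.items():
    share = 100 * count / total
    print(f"{cause}: {count} events ({share:.0f}% of listed causes)")
```

Perception discrepancies alone account for well over half of the listed events – supporting the point that reading the road, not steering the car, is the hard part.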
Ready for adoption: not yet
The reports in effect reveal that autonomous vehicles cannot yet handle complex road situations and are apt to make unexpected moves that are potentially dangerous for their occupants and other road users. A commonly cited complex situation is roadworks, which feature multiple and diverse road signs and instructions that can overwhelm a self-driving car’s control algorithms. As any driver knows all too well, roadworks are very common – yet no programmed autonomous driving solution can cope with them at the moment.
The only scenario that would work is fairly utopian: a road populated solely by autonomous cars, each with fully compatible sensors, and each able to communicate with all the others instantly. It would be unkind to state that autonomous car research to date has resulted in something akin to air traffic control or railway signalling.
Other constraints to autonomous car adoption include allowances for emergency vehicles and the cars’ current inability to anticipate other road users’ moves. An accumulation of more than one such factor could result in a hazardous situation where drivers have to take control. What is more, today’s state-of-the-art autonomous cars need their drivers to be alert and to second-guess the technology at all times. Murphy’s Law suggests that they will undertake unexpected manoeuvres exactly when the driver is relaxed or distracted and is thus unable to take the wheel.
The incidence of disengagements has fallen significantly over the past year. Google reports 0.8 DEs per thousand miles in 2015, and just 0.2 last year. All the same, the technology is still very much in gestation, and the figures pinpoint the areas all manufacturers ought to take to heart when honing their autonomous cars.
By Kiril V. Kirilov
Kiril V. Kirilov is a content strategist and writer who has been analyzing the intersection of business and IT for nearly two decades. Topics he covers include SaaS, cloud computing, artificial intelligence, machine learning, IT startup funding, autonomous vehicles, and all things technology. He is also the author of a book about the future of AI and Big Data in marketing.