Lessons from aviation: “Carmakers should eliminate level 3 in automated driving!”
Technology and Business
In civil aviation, automation has reduced workload and stress for pilots, while making flying one of the safest ways to travel. Can these safety lessons apply to autonomous cars – especially with regard to how we as drivers will interact with them? We spoke to a human factors expert to get answers.
When Captain Marc Dubois stumbled into his cockpit after a quick nap, he was met with a scene of chaos. His aircraft was shaking so violently it was impossible to read its instruments. Jarring alarms. The shrieking voices of his two junior co-pilots. A plummeting altitude.
Based on the final report, this is how one might imagine the last moments preceding one of the most harrowing accidents in aviation history. It’s an important case that is still taught to trainee pilots: In the early hours of June 1, 2009, Paris-bound Air France flight 447 disappeared into the pitch-black Atlantic Ocean, roughly four hours after its on-time take-off in Rio de Janeiro. But what happened?
According to what’s known today, the plane entered what aviation calls a stall. While navigating a tropical thunderstorm, the plane’s iced-over airspeed sensors delivered conflicting readouts, causing the flight computer to disengage the autopilot. For Dubois and his crew, it was high time to take action. Fatally, they had no clue what was going on.
A rare occurrence of automation tech failing – or just human error? Or perhaps both?
AVIATION: THE ACCIDENT RATE WENT DOWN TO ALMOST ZERO
In reality, when automation met aviation, it seemed like a match made in heaven. “The aviation industry has been increasing automation in aircraft cockpits since the 1970s,” says Dr Michael Clamann, Senior Research Scientist at Duke University. “It was intended to reduce mental workload for pilots.” Typically, today’s aviators touch the controls only occasionally during the flight (if at all). Mostly, the onboard computer navigates the aircraft, while the pilot turns into its quiet supervisor.
The result? Automation has helped make air travel significantly safer. An Airbus study shows the yearly accident rate per million flights has dropped to almost zero. While carriers see automation as one of the last decades’ greatest achievements, the reliance on it creates challenges – for today’s pilots and tomorrow’s drivers of automated cars. In fact, the aviation industry has been exploring the effects of automation on pilots for years. Why should the auto business learn from aviation’s experiences? When it comes to automation, the two industries share the same goal: preventing accidents from happening – by eliminating human error.
FLYING AND DRIVING ARE DIFFERENT – BUT THE EFFECTS OF AUTOMATION ARE NOT
It might seem like an apples-to-oranges comparison at first glance: trained professionals with up-to-date licenses who prep every trip down to the finest detail versus laymen with private licenses (sometimes from decades ago), who erratically navigate city traffic.
Yet with regard to automation, Clamann sees parallels. “We’re not talking about a comparison of aviation and driving. It’s about the similar effects that automation has on the human operators in both modes of transport.”
LEARNINGS FROM THE AF 447 CRASH: FLAWLESS HUMAN-MACHINE INTERACTION WILL SAVE LIVES
We know it from soccer: a goalkeeper and a defender from the same team both dive for the ball – and collide. A case of ill-defined responsibilities resulting in things going awry. With pilots and drivers interacting with automated systems, it can look similar. Human factors research refers to this as mode confusion.
“If a person expects that the system is in charge, it’s very difficult for them to figure out what’s going on,” says Clamann. Especially in emergencies, flawed human-machine interaction (HMI) can be critical. And according to Clamann, that’s what led to the AF 447 crash: The machine neither fully diagnosed the faults in its speed sensors nor conveyed what needed to be done to rectify them – for example, by presenting the situation to the human operator clearly and requesting that he take over.
COMPLACENCY AND SKILL DECAY: IT’S HUMAN NATURE – AND PILOTS ARE NO EXCEPTION
Man is a creature of habit, they say. Things we start out pursuing with focus become part of our daily routine. We become complacent. Yet in complex tasks like operating an aircraft, complacency can be a curse. “Anytime we focus on a task like operating a system and we’re not constantly interacting with it, we zone out after about 20 minutes,” says Clamann. “It’s human nature – and pilots are no exception.” It then takes us a while to regain full attention – and even longer to regain vehicle control. Whether it’s 30,000 feet above sea level or on a crowded city road during rush hour, this lag can be too long. A case in point: the fatal 2016 Tesla crash, where the driver overestimated the car’s Level 2 system, misleadingly called “Autopilot”.
Skill comes with practice: That is why pilots spend long hours in flight simulators – not only to train for new situations, but to retain their skills. Is there reason to believe these could degrade? “Doing something actively is not the same as passively watching it being done by automation,” says Clamann. “We’re very likely to see skill decay in automated driving – especially in Level 4,” where vehicles are expected to be able to drive themselves with minimal human input, but may still have to rely on a human to step in at certain moments.
WHAT HMI MUST DELIVER – AND HOW IT COULD KEEP DRIVERS IN THE LOOP
Most of aviation’s safety mechanisms (training, medical checks) seem unrealistic for automated driving (regular eye tests, anyone?). That is why the effects of automation could become even more prevalent in self-driving mobility. “Level 3, especially, is a terrible idea – and carmakers should eliminate it,” says Clamann. In his view, driverless tech is far from being ready to let the average driver lean back, relax and watch a movie. “Unfortunately, that’s the scenario we’re looking at with semi-automation,” he explains. The solution? “It will all come down to creating interfaces that literally work with the operator – proactively and intuitively.” Only then could people be kept from overtrusting automation and helped to maintain a certain level of vigilance.
To help the driver understand the system and to assist the system in knowing its operator, Joe Klesing, executive director at supplier Nexteer, suggests a combination of smart interfaces. In-vehicle cameras, for instance, could track driver fatigue to determine whether the driver is ready to take over – and emit visual or auditory warnings if necessary. “Audible alerts in particular have been shown to be more effective than exclusively visual ones,” says Clamann. Short orders (“Hit the brakes”) would be more effective than lengthy instructions. According to Klesing, head-up displays could help drivers focus on an approaching hazard. The steering wheel could retract to indicate that automated driving is enabled; to resume control, the driver must actively pull it back towards him.
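The escalation Klesing and Clamann describe – visual warning first, then an audible order, then a request to take over – can be pictured as a simple decision rule mapping camera-based driver state to an alert level. The sketch below is purely illustrative; the thresholds, signal names and function are hypothetical assumptions, not any carmaker’s actual implementation.

```python
# Illustrative sketch of a driver-monitoring alert escalation.
# All thresholds and names are hypothetical, for explanation only.

def escalate_alert(fatigue_score: float, eyes_on_road: bool) -> str:
    """Map camera-based driver state to an alert level.

    fatigue_score: 0.0 (fully alert) to 1.0 (asleep), e.g. from
        eyelid-closure tracking by an in-vehicle camera.
    eyes_on_road: whether gaze tracking sees the driver watching the road.
    """
    if fatigue_score < 0.3 and eyes_on_road:
        return "none"              # driver is ready to take over
    if fatigue_score < 0.6:
        return "visual"            # e.g. a head-up display warning
    if fatigue_score < 0.9:
        return "audible"           # short spoken order: "Hit the brakes"
    return "takeover_request"      # hand back control or stop safely
```

Note how the rule reflects Clamann’s point: visual cues come first, but as the driver disengages further, the system switches to the more effective audible channel before demanding a takeover.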
It’s the old question: Does adding more automation to a system automatically make it safer? According to Clamann, automation does one thing: It changes the system, and with that, the way we interact with it. In aviation, the spread of autopilot in cockpits has been accompanied by measures to help pilots understand it – and, landmark failures like AF 447 aside, to anticipate its unwanted effects.
Is the auto business going that extra mile to ensure we will understand the decisions our automated cars make while we cruise in them? In fact, manufacturers find themselves in a race over who will deliver “safe” (semi-)automated vehicles first – with marketing departments doing a pretty good job of magnifying new features to fuel the autonomous dream. Yet what’s easily forgotten amidst this craze: Too often producers deliver an automated driving feature without delivering the appropriate user-centric know-how about its capabilities.
Earl Wiener, a cult figure in aviation safety, coined what we refer to today as Wiener’s Laws of Aviation. It’s a set of tenets, whose insight could apply far beyond aeronautics. Law #28: Any pilot who can be replaced by a computer should be. Looking at aviation’s safety track record, this may be a fitting guiding principle for the development of automated cars.
ABOUT OUR EXPERT:
Michael Clamann is a senior research scientist with the Humans and Autonomy Lab (HAL) at Duke University. He received a Ph.D. in Industrial and Systems Engineering from North Carolina State University in 2014, as well as an M.I.E. in Industrial and Systems Engineering and an M.S. in Experimental Psychology from North Carolina State University in 2011 and 2002, respectively. He is an Associate Director at the Collaborative Sciences Center for Road Safety (CSCRS) and the lead editor of Robotics and AI for the Duke Initiative for Science & Society's Policy Tracking Program (SciPol). His research interests include human-automation interaction, multimodal control and issues at the intersection of technology and society. He has worked in industry as a Human Factors Engineer since 2002, supporting government and private clients in domains including aerospace, defense and telecommunications.
Should Level 3 automated driving be eliminated for the sake of road safety? And which solutions do you think would work best to ensure future drivers of automated vehicles will stay vigilant? Share your thoughts in the comment section!