
Why humans should take a back seat in driverless cars

What lessons can automated driving learn from aviation? (Photo: Adobe Stock)

Henning Butz


As automated vehicles become more advanced, untrained drivers are best excluded from interacting with their car, says aviation industry veteran Henning Butz.

Automated driving may be a novelty to most, yet automated flying has been a standard for more than 20 years. Take a long-distance flight from Munich to Shanghai, for example. While the trip takes 11 hours, pilots fly the aircraft manually for as little as one minute on average. For the rest of the time, the plane flies automatically.

The autopilot kicks in shortly after take-off. En route, the aircraft is guided by the flight management system until the autopilot begins the descent into Shanghai Pudong Airport, supported by the airport’s instrument landing system. Once the aircraft has touched the tarmac, it is slowed to a pre-programmed speed (“brake to vacate”) so it can exit the runway on the pre-defined taxiway. In an ironic twist, the pilot then “drives” the plane to the gate.

This level of automation is not only safer than flying the plane manually; it also makes the journey more comfortable and more profitable. Yet this is not a linear development: more automation does not automatically increase safety, comfort and profits. To achieve these gains, air travel needs:

  • a controlled and largely standardized airspace, as implemented by traffic guidance systems and air traffic control,
  • legally binding procedures for pilots and air traffic controllers, including continuous training, and
  • clear standards that define the interaction between automated aircraft, guidance systems and standard operating procedures.

Without these or similar features, automated traffic systems are liable to suffer uncontrollable failures. Undefined traffic situations are likely to occur and operators are at greater risk of committing serious mistakes, causing a situation in which they are unable to re-establish the safe operation of their vehicle.

Complementary features

To close potential gaps, safety features also need to be redundant. This guarantees that air traffic as a system remains safe even if parts of it fail. The loss of contact with air traffic controllers, for example, does not impede a plane’s ability to fly safely, even in adverse weather, as on-board instruments and established cockpit procedures ensure the pilots’ control of the aircraft. Consequently, an aircraft is at risk only when two safety features fail simultaneously:

  • flight systems and air traffic control
  • cockpit crew and flight systems
  • air traffic control and parts of the cockpit crew
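The redundancy argument can be made concrete with a small probability sketch. Assuming, purely for illustration, independent failures and entirely hypothetical per-flight failure probabilities for each safety layer (these are not real aviation figures), the chance that two layers fail at once is the product of their individual probabilities — orders of magnitude smaller than any single layer failing alone:

```python
# Hypothetical, illustrative failure probabilities per flight
# for each safety layer (not real aviation figures).
p_fail = {
    "flight_systems": 1e-5,
    "air_traffic_control": 1e-4,
    "cockpit_crew": 1e-3,
}

def joint_failure(layers, probs):
    """Probability that ALL of the given layers fail at once,
    assuming their failures are statistically independent."""
    result = 1.0
    for layer in layers:
        result *= probs[layer]
    return result

# The aircraft is at risk only when two layers fail simultaneously:
for pair in [("flight_systems", "air_traffic_control"),
             ("cockpit_crew", "flight_systems"),
             ("air_traffic_control", "cockpit_crew")]:
    print(pair, joint_failure(pair, p_fail))
```

Under these assumptions the riskiest pairing is still far less likely than the most reliable single layer — which is why the loss of one safety feature, by itself, does not endanger the flight.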

For automated driving, this means that an autonomous car without external safety features can be detrimental to overall reliability if operators do not fully understand the state their vehicle is in. The problem is known as the “ironies of complex automation” and was widely researched in the early 1980s. Placing an untrained driver in a sophisticated automated car meets all aspects of the phenomenon.

The automated driving safety net

The human risk factor

So how can these insights from aviation help the automated driving community?

Much like airspace, ground traffic is governed by rules and standard procedures, and is supported by features such as cruise control and distance meters. While the basic approach is similar to aviation, the level of standardization is lower and drivers have significantly more wiggle room for individual decisions. Additionally, drivers’ ability to handle their vehicle in a challenging situation will likely decrease as automated driving technology becomes more advanced.

This creates a significant risk during the transition from partially to fully automated driving: the interaction between unskilled humans and highly complex machines reinforces the ironies of automation and thus lowers operational safety.

The automotive industry therefore faces significant challenges on the road to fully automated driving. As of today, responsibility for safeguarding the system’s integrity rests almost exclusively with the vehicle. There are no additional safety features such as traffic guidance systems or rigorously standardized operating procedures for drivers. Compared with aircraft manufacturing, the automotive industry is also subject to relatively few regulations. It is therefore crucial to establish additional safeguards as soon as possible.

We should also re-organize public space to increase safety. Standards and regulations need to be communicated to vehicles automatically so that cars, not drivers, follow them. The automotive industry is well advised not to rely on measures that raise drivers’ skills to a level comparable to that of pilots. Rather, automated cars should offset the shortcomings of their human operators (cognitive automation).

Automated cars should offset the shortcomings of human operators (Photo: Adobe Stock)

Close the front door

Untrained drivers should have no control whatsoever over highly complex means of transportation. The only safe alternative is to put drivers in the back seat, where they can enjoy the ride as passengers. If automated driving is to be successful, it is crucial to establish a “closed cockpit” in which passengers have no influence on the vehicle’s operation.

The success of automated driving will hinge on automated systems’ ability

  • to compensate for human error,
  • to establish a closed cockpit, and
  • to align both with a largely standardized environment.

As soon as automated vehicles gather input to act independently, their interpretation of a given scenario will become incomprehensible to humans. Translating the machine’s behavior into a language that we can understand will only be possible at the beginning of automated driving – and even then, it will be an onerous and error-prone task.

Out of the loop

Still, the machine’s strategy will be optimal in most scenarios and certainly better than a human driver’s approach. We are therefore well advised to limit human intervention. Human-machine interaction should be confined to entering a destination and a time of departure or arrival. Keeping humans in the loop of a deep-learning automated vehicle is next to impossible anyway. It will be our task to think about the consequences this will have for our ethics and norms.
