
The fatal driverless crash: an update on the March 2018 incident

Private Life and Mobility

Raven Brookes

21 November 2019


New evidence shows the pedestrian killed in the 2018 self-driving Uber crash 'probably would have lived' if the automatic braking feature hadn’t been shut off.


New documents from the National Transportation Safety Board (NTSB), released on 5 November 2019, have revealed that the automatic emergency braking feature that comes as standard with the car model in question was shut off by Uber to avoid 'interference' with its self-driving testing on public roads.

 

The first driverless fatality

As we reported shortly after the incident took place, on 18 March 2018, 49-year-old Elaine Herzberg was hit and killed by an Uber-operated Volvo XC90 as she pushed her bicycle across a dark and busy road in Tempe, Arizona. It was the first time a pedestrian had been killed by a self-driving vehicle.

 

The car had been retrofitted by Uber with the cameras, sensors and software needed for driverless operation, and was in the process of being tested. It failed to slow down or stop as the woman crossed, and the 'safety driver' was unable to act in time. The crash led to Uber immediately halting their testing of driverless vehicles in Arizona, San Francisco, Toronto and Pittsburgh, and a full enquiry was launched.

 

The driver, and Uber, were quickly cleared of any wrongdoing when it came to light that a software failure had led to the woman not being 'seen', or at least, not being 'seen' as a pedestrian.


The new NTSB findings

Further details have since emerged, making it apparent that the system didn't account for pedestrians 'jaywalking', i.e. crossing illegally. Herzberg crossed around 100 yards from a crosswalk, which explains why the car failed to classify her correctly despite having 5.6 seconds to react: it identified her as another vehicle, not as a pedestrian.
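To make that failure mode concrete, here is a minimal, entirely hypothetical sketch, in Python, of how a perception system that only expects pedestrians near crosswalks can misread someone crossing mid-block. This is in no way Uber's actual code; every name and threshold below is invented for illustration.

```python
# Hypothetical illustration only -- NOT Uber's actual perception code.
# Shows how assuming pedestrians appear only near crosswalks can cause
# an object crossing mid-block to be misclassified, and its path ignored.

from dataclasses import dataclass

CROSSWALK_RADIUS_M = 30.0  # invented threshold for "near a crosswalk"

@dataclass
class DetectedObject:
    distance_to_crosswalk_m: float  # distance from the nearest crosswalk
    moving_across_lane: bool        # lateral motion across the car's path

def classify(obj: DetectedObject) -> str:
    """Naive classifier that only expects pedestrians near crosswalks."""
    if obj.distance_to_crosswalk_m <= CROSSWALK_RADIUS_M:
        return "pedestrian"
    # Away from a crosswalk, anything moving is lumped in with traffic.
    return "vehicle"

def should_brake(obj: DetectedObject) -> bool:
    """Plan braking only for objects expected to cross our path."""
    if classify(obj) == "pedestrian":
        return True  # a crossing pedestrian always triggers braking
    # A "vehicle" is assumed to stay in its own lane, so no braking is
    # planned -- even though this object is actually walking across.
    return False

# Herzberg-like scenario: crossing roughly 100 yards (about 91 m)
# from the nearest crosswalk, moving laterally across the lane.
jaywalker = DetectedObject(distance_to_crosswalk_m=91.0,
                           moving_across_lane=True)
print(classify(jaywalker))      # "vehicle"
print(should_brake(jaywalker))  # False: the crossing is never anticipated
```

The specifics here are made up; the point is the design flaw. Once the object is labelled a 'vehicle', its path across the road is never anticipated, so no evasive action gets planned.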

 

The NTSB investigators have since released information which has turned some of the attention back to Uber. Most modern cars, including the XC90, come with a factory-equipped automatic emergency braking feature which kicks in when the driver is distracted or fails to brake in good time. Ahead of the driverless testing phase, Uber overrode this feature so that it wouldn't interfere with their own system, or with the testing.
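For readers curious what 'kicks in' means in practice, here is a simplified, hypothetical sketch of the time-to-collision logic a typical factory AEB system uses, including the kind of 'enabled' switch that Uber reportedly turned off. This is not Volvo's actual implementation, and the thresholds are invented.

```python
# Simplified, hypothetical time-to-collision (TTC) logic for a factory
# AEB system -- illustrative only; thresholds are invented, and this is
# not Volvo's actual implementation.

AEB_TTC_THRESHOLD_S = 1.5     # invented: brake if impact is this close
DRIVER_BRAKE_THRESHOLD = 0.1  # invented: pedal input that counts as braking

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither party changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the obstacle at all
    return distance_m / closing_speed_mps

def aeb_should_fire(distance_m: float, closing_speed_mps: float,
                    driver_brake_input: float, aeb_enabled: bool) -> bool:
    """Fire the emergency brake when the driver hasn't reacted in time."""
    if not aeb_enabled:
        return False  # the reported Uber configuration: feature shut off
    ttc = time_to_collision(distance_m, closing_speed_mps)
    driver_is_braking = driver_brake_input > DRIVER_BRAKE_THRESHOLD
    return ttc < AEB_TTC_THRESHOLD_S and not driver_is_braking

# An obstacle 25 m ahead, closing at 17 m/s (about 38 mph), no braking:
print(aeb_should_fire(25.0, 17.0, 0.0, aeb_enabled=True))   # True
print(aeb_should_fire(25.0, 17.0, 0.0, aeb_enabled=False))  # False
```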

 

Volvo have since run tests emulating the circumstances of the collision, and found that their system should have prevented, or at least lessened, the impact. As the NTSB reported: “[the SUV] would have avoided the collision with the pedestrian in 17 out of 20 variations — the pedestrian would have moved out of the path of the SUV.”

 

However, Uber were more than likely right to shut off this system. Experts consulted on the case have since said it made 'technical sense', as it would have been "unsafe for the car to have two software 'masters'."

 

So yes, this ultimately means the driverless system was at fault. Not Uber, and not the safety driver – who was cleared when it was determined that Herzberg 'appeared from the shadows' in such a way that the driver could hardly have reacted in good time.


The aftermath

Mistakes have been made, and lessons have been learned. New software has been developed – by Uber, by Volvo and by other driverless tech companies – to ensure vehicles can clearly distinguish humans from other vehicles, no matter how they use the roads. But there's still the matter of public trust to settle.

 

According to Wired:

 

"On November 19, the NTSB will hold a meeting in Washington, DC, on the incident. Investigators will then release a comprehensive report on the crash, detailing what happened and who or what was at fault. Investigators will also make recommendations to federal regulators and to companies like Uber building the tech outlining how to prevent crashes like this in the future."

 

Aside from the Tesla ‘Autopilot’ crash of 2016, which killed the driver, Herzberg was the first and, to date, the only 'uninvolved' victim of a self-driving vehicle. As public road testing is stepped up, however, the potential for more such incidents is statistically increasing. But does that mean autonomous driving will be abandoned as a possibility?

 

In the original article about the crash, we drew on a previous 2025AD interview with Professor Armin Grunwald, a renowned technology assessor for the German parliament. He said:

 

“At some point, this complex technology in some specific situation will have a malfunction and cause an accident. And people will be harmed by it. This will spur debates. Then we will need to look at the overall statistic: currently, humans are responsible for more than 90% of all accidents.”

 

Grunwald went on to predict: “If autonomous driving can reduce a large proportion of those road deaths, the public will accept it. Even if the systems fail occasionally, it is not likely that people will reject the technology as a whole.”

 

Join the debate! Is this likely to happen again, or have the driverless technology companies learned their lesson?
