
The crash and the consequence: What the fatal Tesla accident means for automated driving

Shattered dreams? The first fatal Autopilot car crash may impact a whole industry. (Photo: Reuters)

Kate Mann

The first fatal crash of a vehicle in Autopilot mode raises questions for the industry, for legislators - and for the general public. An analysis.

Today, as news of the first fatality in a self-driving vehicle breaks, the auto industry finds itself at a point of reflection. In May 2016, a Tesla Model S electric sedan set to Autopilot collided with a truck in Florida. Whilst the truck driver survived, the Tesla driver, Joshua Brown, did not. With a formal investigation into the incident now underway in the U.S., the crash has become public knowledge, and with it the debate over its consequences. The incident marks a significant and tragic moment for the development and testing of automated vehicles. We take a look at the issues it raises – for Tesla, for the car industry, for the general public and for legislators.

Partial automation - partially safe?

At the center of the investigation is the Autopilot function launched by Tesla in October 2015. Long seen as a trail-blazer when it comes to driverless tech, the company released the software in a public beta phase. From the outset Tesla have cautioned drivers about its use. In particular, they have stressed that the driver cannot abdicate responsibility when using the Autopilot function and has to keep their hands on the steering wheel at all times. 

Despite the warnings, however, many drivers were keen to test out the function – sometimes dangerously pushing its limits – with several publishing well-watched videos on YouTube. Tragically, the crash victim, Joshua Brown, had himself posted a video in April showing how Autopilot had helped him avoid a collision on a highway (Tesla boss Elon Musk retweeted it to millions of followers).

This general curiosity and excitement reflects the fact that the technology is completely new to both drivers and companies.

Referred to as partial automation, or Level 2 to those in the know, Autopilot leaves the car operating halfway between manual and driverless. Because it still relies on human intervention, partial automation is often viewed as a transitional phase in the automation process.

This middle ground has raised questions, both now and in the past, about how well humans can remain focused when they have the option of letting go of the steering wheel. Will drivers always be tempted to try out any functionality at their disposal, even beyond its permitted use?

According to the National Highway Traffic Safety Administration (NHTSA), preliminary reports indicate that the crash occurred when a tractor-trailer made a left turn in front of the Tesla and the car failed to apply the brakes. Tesla themselves have stated that the crash most likely occurred because “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.” The statement suggests that a number of factors contributed to the crash. But whatever the outcome of the investigation, the incident will almost certainly spark a new discussion on semi-automated solutions.

Questions will also now be asked about the safety of releasing beta versions of such technologies, since some may assume that if something is available, it is “ready to go.”

Public acceptance: What is automated driving worth to society?

The larger question is, of course: Will the incident hinder public acceptance of driverless cars? For car companies trying to reassure people about the safety benefits of self-driving technology, this event might come as a particular blow. As the Guardian puts it, the accident comes “at a time when Americans have just started to become more comfortable with letting machines take the wheel.” The paper argues that the new development “is sure to cause consumers to second-guess the trust they put in the booming autonomous vehicle industry.”

For others, however, a fatal self-driving accident was inevitable. In an interview with 2025AD in March, Armin Grunwald, technology assessor for the German parliament, told us that he had “no doubt: at some point, this complex technology in some specific situation will have a malfunction and cause an accident. And people will be harmed by it. This will spur debates.” However, Grunwald also suggested that the impact of such an event depends on how you look at it: “Currently, humans are responsible for more than 90 percent of all accidents. If autonomous driving can reduce a large proportion of those road deaths, the public will accept it. Even if the systems fail occasionally, it is not likely that people will reject the technology as a whole,” he said.

Tesla used a similar argument in their response to the accident, highlighting that “This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”
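Put on a common scale, those figures work out to roughly 0.8 fatalities per 100 million miles with Autopilot engaged, compared with about 1.1 per 100 million miles for all U.S. traffic and about 1.7 per 100 million miles worldwide. With only a single Autopilot fatality on record, however, that comparison rests on a very small sample.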

However, as nothing fuels skepticism more than an incident that confirms existing suspicions, it could take some time for people to see this bigger picture again.

Legislation: Is Europe taking the safer road?

A final issue that will come into question is legislation. As driverless technology advances, the differences between U.S. and European legislation have become increasingly apparent. In June, we reported that America’s bold approach to regulation could see Europe left in its wake. Whilst in Europe legislation for automated driving is advancing in small steps, the technology is clearly being driven forward much faster by the U.S. government. With automated steering systems in Europe restricted to low-speed maneuvering capped at 10 km/h, accidents are far less likely here than in the U.S. While the cautious approach of European legislators has often been criticized in the past, it might not look like such a bad idea anymore in light of the recent development.

The legal implications of the current case in the U.S. remain unknown. However, with both the investigation and the relevant legislation being the responsibility of the NHTSA, the situation could raise some conflicts of interest.

What’s next: Lessons from the crash

For many, the incident will serve as a reminder that improvements and advancements still need to be made - and that the warnings and restrictions issued by carmakers need to be taken seriously. One thing seems certain: the reactions to the fatal crash will show just how willing both consumers and lawmakers are to place their trust in future technologies. Will they keep looking at the statistics, which clearly show that automation reduces accidents overall? Or will the fatal accident spark new fears of putting one’s life into the hands of a machine? Elon Musk and his team will certainly hope for the former as they express their condolences.
