
How Uber fell into the complacency trap of automated driving

Uber will hire so-called "mission specialists" to operate its autonomous cars. (Photo: Uber)

Stephan Giesler

Uber leaves many questions unanswered, autonomous trucks might save lives and massive cyber-attacks are looming: we bring you this week’s key stories from the world of automated driving!

“Man is a creature of habit, they say.” That’s a line we wrote last year in an article that detailed the vicious trap that is complacency. In the words of Dr Michael Clamann, Senior Research Scientist at Duke University: “Anytime we focus on a task like operating a system and we’re not constantly interacting with it, we zone out after about 20 minutes. It’s human nature.”

The complacency effect is a well-documented phenomenon that seriously threatens road safety. Sadly, even people who should know better fall into the trap – including companies that test autonomous vehicles. A police report released in June showed that the fatal crash of a self-driving Uber vehicle into a pedestrian in Tempe, Arizona, could have been avoided. The 44-year-old vehicle operator had been watching a TV show on her mobile phone right before the crash. "Our operators are expected to maintain attentiveness to the road ahead and react when the system fails to do so, and are trained to uphold this responsibility," Uber said in a statement – training which evidently fell on deaf ears.

Uber halted all trial operations immediately after the crash and announced last week that it will lay off about 100 vehicle operators. It intends to replace them with 55 so-called "mission specialists". According to Uber, these specialists are trained in both on-road and more advanced test-track operations.

While the term "mission specialist" does sound catchy, one important detail is missing: transparency. What prerequisites must one possess to be a safety operator in one of Uber's autonomous vehicles? Has the training since been modified to prevent a repeat of such complacency and improve road safety? Given its record of dubious business practices, Uber will always be subject to heightened public scrutiny. It takes years to build trust but only seconds to destroy it, as they say. That goes for people, companies and technologies alike.

Follow me: Autonomous trucks for safer roadworks

While on the topic of road safety: for years, shifting roadworks on highways have been a frequent source of severe traffic accidents. To warn other road users and protect the construction workers, a safety truck normally follows at a set distance behind the actual worksite. But time and again, semitrailers or other vehicles have crashed into the safety truck – which makes driving it a rather dangerous task. Enter MAN.

Their autonomous safety truck “Afas” was deployed for the first time on June 20 on the hard shoulder of the busy A3 highway in Germany. In standard operation, it followed the leading construction truck at a distance of 100 meters (328 feet). When things got tricky, e.g. on exit ramps, it would pull up close to the leading vehicle. Finally, if there was a data mismatch (or if triggered manually), it would come to a safe stop. According to German newspaper FAZ, it was the first time an autonomous truck took to the highway in real-life conditions without a human even present in the driver’s cabin.

This trial proves two things. First, trust is best created through successful operation, not press announcements (could MAN give Uber a call?). Second, the deployment of autonomous trucks is edging closer: according to MAN, the system is not far from series production. Which drives home the point we at 2025AD have been making for a while – that self-driving trucks are one of the most obvious business cases for autonomous technology.

Let’s talk about hacks, baby

Should law enforcement be allowed to remotely stop a driverless car in the name of the law? This question has already spurred a broader moral and legal debate. According to Reuters, the debate has reached the U.S. government level. At a meeting of U.S. transportation regulators in March, law enforcement officials expressed interest in potentially being able to control autonomous vehicles during emergencies. The downside: this would also open a gateway for criminal hackers. Reuters cites a report of the meeting, according to which participants "agreed that it is a question of when, not if, there is a massive cyber security attack targeting" autonomous vehicles, and that "planning exercises are needed to prepare for and mitigate a large-scale, potentially multimodal cyber security attack."

Now that is what I call a blunt statement. Remarkably, the report Reuters cites was made public by the U.S. Transportation Department itself. And on reflection, that openness is actually a great contribution to winning acceptance of automated driving. It would be naive to assume that automated driving systems will ever be completely immune to cyber-attacks – just as online banking will never be completely immune to criminal hacking. Pretending there is no risk is akin to deceiving the public. Telling the truth means taking the public seriously.

So long, drive safely (until cars are driverless),

Stephan Giesler

Editor-in-Chief, 2025AD

