Uber self-driving crash: Software set to ignore objects on road

A self-driving Uber that struck and killed an Arizona pedestrian in March may have been set to ignore “false positives,” according to a report about the company’s investigation.

Source: Uber self-driving crash: Software set to ignore objects on road

RB note: I’m posting this tragedy because of the last paragraph. Uber had adjusted the sensitivity threshold on its obstacle-detection algorithm. Too sensitive ==> too many false positives ==> the car keeps jamming on the brakes ==> passengers are uncomfortable. So Uber lowered the sensitivity (detect fewer events).
Too insensitive ==> the car ignores a few real obstacles on the road ==> it hits a bike or pedestrian.
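The trade-off reduces to where a single confidence threshold is set. Here is a minimal sketch of that idea; the function name, the confidence scores, and the threshold values are all my invention for illustration, not anything from Uber's actual system:

```python
# Hypothetical sketch: one confidence threshold decides whether a
# detection triggers braking. All names and numbers are illustrative.

def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Brake only when the detector's confidence exceeds the threshold."""
    return detection_confidence >= threshold

# How a detector might score a shadow, a plastic bag, and a pedestrian.
detections = {"shadow": 0.30, "plastic_bag": 0.55, "pedestrian": 0.80}

# High sensitivity (low threshold): brakes for everything, jerky ride.
assert all(should_brake(c, threshold=0.25) for c in detections.values())

# Low sensitivity (high threshold): smooth ride, but the pedestrian
# is ignored along with the false positives.
assert not any(should_brake(c, threshold=0.90) for c in detections.values())
```

A single fixed threshold forces exactly the dilemma above: any setting that suppresses the plastic bag also suppresses, or delays, the pedestrian.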

According to the Information’s report on Uber’s investigation, the company may have tuned the self-driving software to not be too sensitive to objects around it because it is trying to achieve a smooth self-driving ride. Other autonomous-vehicle rides can reportedly be jerky as the cars react to perceived threats — that are sometimes non-existent — in their way.

It’s not mentioned in this short article, but even with a less sensitive algorithm, the car should have been continuously updating the probability of an obstacle. As the car got closer, that probability should have risen until the car eventually jammed on the brakes. It still would have hit the pedestrian, but at a lower speed, possibly not killing her. So not only was the threshold set wrong; the real-time updating method may also have been nonexistent or ineffective.
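One way to make this point concrete is a simple Bayesian update: if the same obstacle keeps being re-detected frame after frame, the estimated probability should climb past even a deliberately conservative threshold within a few frames. The sensor-model numbers below are made up for the sketch; nothing here describes Uber's actual software:

```python
# Illustrative only: with frame-by-frame updating, accumulated evidence
# crosses even a high threshold, so the car brakes late rather than never.
# The likelihoods and threshold are assumed values, not real parameters.

def bayes_update(prior: float,
                 p_hit_given_obstacle: float = 0.7,
                 p_hit_given_clear: float = 0.3) -> float:
    """Posterior probability of an obstacle after one positive sensor frame."""
    num = p_hit_given_obstacle * prior
    return num / (num + p_hit_given_clear * (1.0 - prior))

threshold = 0.95          # a deliberately "insensitive" setting
p_obstacle = 0.10         # weak initial belief in an obstacle
frames_to_brake = 0
while p_obstacle < threshold:
    # Each frame, the pedestrian is detected again and belief is updated.
    p_obstacle = bayes_update(p_obstacle)
    frames_to_brake += 1

# Consistent detections push belief past the threshold in a handful of frames.
assert frames_to_brake < 10
```

The point of the sketch: a high threshold on a *continuously updated* probability delays braking by a few frames, which is very different from never braking at all.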

The original article, with much more detail, is here (email required). It includes the following paragraph:

Hiring better drivers and giving them better tools to avoid such accidents—such as visual or audio alerts when the system decides to ignore certain objects it doesn’t think they’re threats—also may be necessary. (emphasis added)

Frankly, that sounds pretty obvious. A detection system should have multiple levels of sensitivity:

  • Least sensitive when the safety driver is clearly looking at the road.
  • Intermediate sensitivity, plus an alarm to the driver, when it detects a possible obstacle and he/she is not looking at the road.
  • Most sensitive when the driver is heavily distracted, and a model of driver behavior suggests they will take a long time to respond after an alarm.
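The tiered scheme above can be sketched in a few lines. The driver states, threshold values, and alarm policy here are my own illustrative choices, not a proposal for actual values:

```python
# A hedged sketch of tiered sensitivity keyed to driver attention.
# States, thresholds, and the alarm policy are assumptions for illustration.

def obstacle_policy(driver_state: str) -> tuple[float, bool]:
    """Return (confidence threshold, sound_alarm) for a given driver state."""
    if driver_state == "eyes_on_road":
        return 0.8, False   # least sensitive: driver can intervene quickly
    if driver_state == "glancing_away":
        return 0.5, True    # intermediate: brake sooner and alert the driver
    return 0.3, True        # distracted: most sensitive, slow response expected

assert obstacle_policy("eyes_on_road") == (0.8, False)
assert obstacle_policy("distracted") == (0.3, True)
```

The design choice worth noting: the alarm and the threshold move together, so the system compensates for a slower human response with an earlier machine response.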

But as my friend Don Norman would point out, there are lots of feedback loops operating here, whose effects need to be determined. In the long run, drivers may be retraining cars. But even in a few hours of driving, an automated system will “train” a driver. If the system always beeps for obstacles, a driver might get even more complacent.

Author: Roger Bohn

Professor of Technology Management, UC San Diego; visiting at Stanford Medical School. Rbohn@ucsd.edu. Twitter: Roger.Bohn
