Self-driving Uber car that hit and killed woman did not recognize that pedestrians jaywalk

Spectre

The Deported
Joined
Feb 1, 2007
Messages
36,778
Location
Dallas, Texas
Car(s)
00 4Runner | 02 919 | 87 XJ6 | 86 CB700SC
An update from here: Uber Disabled Volvo SUV's Safety System Before Fatality - The NTSB has released its report. More pics and links at source.

Self-driving Uber car that hit and killed woman did not recognize that pedestrians jaywalk
The automated car lacked "the capability to classify an object as a pedestrian unless that object was near a crosswalk," an NTSB report said.

A self-driving Uber car that struck and killed an Arizona woman was not able to recognize that pedestrians jaywalk, the National Transportation Safety Board revealed in documents released earlier this week.

Elaine Herzberg, 49, died after she was hit in March 2018 by a Volvo SUV, which had an operator in the driver's seat and was traveling at about 40 miles per hour in autonomous mode at night in Tempe.

The fatal accident resulted from the automated Uber not having "the capability to classify an object as a pedestrian unless that object was near a crosswalk," one of the NTSB documents said.

Because the car could not recognize Herzberg as a pedestrian or person — instead alternating between classifications of "vehicle, bicycle, and an other" — it could not correctly predict her path and concluded it needed to brake just 1.3 seconds before it struck her as she wheeled her bicycle across the street a little before 10 p.m.

Uber told the NTSB that it "has since modified its programming to include jaywalkers among its recognized objects," but other concerns were also expressed in NTSB's report.

Uber had disabled the emergency braking system, relying on the driver to stop in this situation, but the system was not designed to alert the operator, who only "intervened less than a second before impact by engaging the steering wheel," the documents said.

That safety driver was working alone — a recent change in procedure — and didn't keep her eyes on the road, the report said. She was streaming the television show "The Voice," according to a police report cited by NBC Philadelphia.

The NTSB also noted that Uber's Advanced Technologies Group had a technical system safety team in place, but failed to "have a standalone operational safety division or safety manager." The company also "did not have a formal safety plan, a standardized operations procedure (SOP) or guiding document for safety."

Sarah Abboud, an Uber spokeswoman, told Reuters that the company regretted the crash, but said Uber's automated program has “adopted critical program improvements to further prioritize safety. We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations.”

Between September 2016 and March 2018, Uber's test vehicles were involved in 37 crashes while driving autonomously, but only two resulted from a car's failure to identify a roadway hazard.

Herzberg's family settled with Uber out of court.

Uber announced it had relaunched its self-driving cars nine months after the accident.
https://www.nbcnews.com/tech/tech-news/self-driving-uber-car-hit-killed-woman-did-not-recognize-n1079281

  1. What idiot programmed this?????
  2. Holy crap, Uber's ATG is a bunch of idiots.
 

Spectre

I see no issue with this... Add logic to run over cyclists and I know which self-driving car I want
Yes, but then you realize that their system may not correctly classify large animals that may wander into the roadway. Receiving a cow to the face might tend to ruin your whole day.
 

MWF

Now needs wood
Joined
May 29, 2008
Messages
27,751
Location
MWF HQ, Ukadia
Car(s)
MX-5 1.8i Indiana SE, update pending
This is idiocy on a whole new scale.
  1. Allowing autonomous vehicles anywhere they could encounter a cyclist or pedestrian is criminal.
  2. Looking before you cross the road is generally considered a good idea.
 

Spectre

Hmm good point, though not many cows around here...
Replace "cow" with deer, elk, moose, bear, mother-in-law or other large beast found in North America as required. :p

More seriously, I'm wondering what it makes of all the other moving stuff that can enter the roadway - such as tumbleweeds, plastic bags, newspapers, windblown boxes, etc.
 

prizrak

Forum Addict
Joined
Apr 2, 2007
Messages
20,888
Location
No, sleep, till, BROOKLYN
Car(s)
11 Xterra Pro-4x, 12 'stang GT
Allowing autonomous vehicles anywhere they could encounter a cyclist or pedestrian is criminal.
They have to know how to deal with all of that
More seriously, I'm wondering what it makes of all the other moving stuff that can enter the roadway - such as tumbleweeds, plastic bags, newspapers, windblown boxes, etc.
That's a bigger concern. Forgetting the non-existent large animals of NYC, I would assume the basic logic of any self-driving software would be to not hit shit that's in front of it.
 

Eye-Q

Forum Addict
DONOR
Joined
Feb 17, 2007
Messages
5,419
Location
Hamburg, Autobahnland
Car(s)
None anymore...
That's a bigger concern, forgetting the non-existent large animals of NYC I would assume basic logic of any self driving software would be to not hit shit that's in front of it.
Have you read the article? For the self-driving software, the woman wasn't "in front" until 1.3 seconds before impact, which was way too late. No human driver re-classifies a cyclist, pedestrian or whatever entering the road perpendicular to them multiple times within a second, which means every human driver (who isn't distracted) will correctly predict that the cyclist, pedestrian or whatever might continue its collision path with the car, and brake.

The issue is mostly about prediction. Currently most humans predict the behaviour of other road users better than self driving software since
a) self driving software has been written by other humans (who experienced mostly the conditions and behaviour of other road users in their vicinity which can differ extremely from conditions and behaviour of other road users in other countries) and
b) this kind of software has to work in every condition (light, dark, fog, back light, ice on the road, roads with and without markings etc.) so it has to be pretty conservative to cover every single eventuality. Driving conservatively isn't what most people do though so a "conservatively" driving autonomous car will be driving "slowly"/"erratically" for those people.
 

Spectre

Have you read the article? For the self-driving software, the woman wasn't "in front" until 1.3 seconds before impact, which was way too late. No human driver re-classifies a cyclist, pedestrian or whatever entering the road perpendicular to them multiple times within a second, which means every human driver (who isn't distracted) will correctly predict that the cyclist, pedestrian or whatever might continue its collision path with the car, and brake.
Except even before she was directly in front of the car, it apparently had problems figuring out what it was. Mentioned in the article, but I'll quote directly from the press release regarding the preliminary report, available here: https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx

The report states data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.


(The original post included an image from the NTSB report here; the numbers on it were distances in meters.) The system had actually seen the person in the roadway as a bicycle with at least six seconds of warning, versus the roughly 25 meters needed to stop from 43 mph. It could see that the paths were going to converge, but it couldn't identify what the converging object was, and it became obsessed with classifying just what it had seen, to the point that it apparently made assumptions about the bicyclist's path based on what the system thought it was, instead of simply checking the RADAR and LIDAR track and deciding it needed to brake, until it was too late. Six seconds is more than enough for most people to figure out there was something there and hammer the brake pedal.
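A quick back-of-the-envelope check of that time/distance budget (my own arithmetic, not from the NTSB report; the 0.7 g deceleration figure is an assumed value for ordinary hard braking on dry pavement):

```python
# Sanity check: how much room did the car have at first detection?
MPH_TO_MS = 0.44704
speed = 43 * MPH_TO_MS                  # ~19.2 m/s when the object was first registered

detection_time = 6.0                    # seconds before impact, per the NTSB
distance_to_impact = speed * detection_time       # ~115 m of warning distance

decel = 0.7 * 9.81                      # ~6.9 m/s^2, assumed hard-braking deceleration
stopping_distance = speed**2 / (2 * decel)        # v^2 / (2a), ~27 m to stop

print(f"warning distance:  {distance_to_impact:.0f} m")
print(f"stopping distance: {stopping_distance:.0f} m")
```

In other words, under these assumptions the car had roughly four times the distance it needed to stop completely, before any steering input is even considered.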

I would also say that at this stage of development, if the system is uncertain of what it's seeing, beyond the fact that the detected object is on a collision course, it should begin light braking and alert the safety driver. This one didn't.

The issue is mostly about prediction. Currently most humans predict the behaviour of other road users better than self driving software since
a) self driving software has been written by other humans (who experienced mostly the conditions and behaviour of other road users in their vicinity which can differ extremely from conditions and behaviour of other road users in other countries) and
b) this kind of software has to work in every condition (light, dark, fog, back light, ice on the road, roads with and without markings etc.) so it has to be pretty conservative to cover every single eventuality. Driving conservatively isn't what most people do though so a "conservatively" driving autonomous car will be driving "slowly"/"erratically" for those people.
No, the problem isn't about prediction. The problem was that the system got hung up on identifying just what it was seeing on the road (per the logs and NTSB investigation!) instead of realizing it needed to stop first and identify the object later.

I would also point out that it seems likely that this software is being written by humans who are not drivers experienced in a wide variety of driving conditions nor do they seem to have asked people who were, because the logic used here was absolute crap. When something comes leaping into your path from the roadside with only a couple seconds to a potential collision, a driver does not (or at least should not) pause to try to identify what exactly that item is. The human decision tree comes down to: Sudden obstacle!-which way is it going and how fast?-how do I steer or brake to avoid collision? You don't freeze and think, "Is that a deer? Is it a feral hog? Is it a tumbleweed? Is it a fat guy who consumed too much beer at the local fall festival? Is it two dudes running from Nazis on a motorcycle with a sidecar?"


No, you brake and evade first and ask stupid questions later.
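That "brake first, ask questions later" rule can be sketched as a toy example. To be clear, this is my own illustration, not Uber's actual code; the `Track` fields and the 2-second/6-second thresholds are invented for the sketch. The point is that the safety response keys off the fused sensor track's geometry, never off the classifier's label:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A fused RADAR/LIDAR track; classification may still be unknown."""
    time_to_collision: float     # seconds, computed from track geometry alone
    on_collision_course: bool
    label: str = "unknown"       # classifier output, possibly flapping

def plan_response(track: Track) -> str:
    # An unclassified object on a collision course still triggers braking:
    # the label is irrelevant to whether we are about to hit something.
    if track.on_collision_course and track.time_to_collision < 2.0:
        return "emergency_brake"
    if track.on_collision_course and track.time_to_collision < 6.0:
        return "light_brake_and_alert_driver"   # what the post above argues for
    return "continue"

# Brakes even though the classifier has no idea what the object is.
print(plan_response(Track(time_to_collision=5.0, on_collision_course=True)))
```

By the article's account, the actual system inverted this priority: it kept trying to settle the label (and derived its path prediction from the label) before committing to a braking decision.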
 

thevictor390

Teen Wankeler
Joined
Mar 9, 2007
Messages
11,870
Location
Massachusetts
Car(s)
'17 Mazda MX-5 RF, '89 Toyota Blizzard SX5
The system also has an automatic built-in delay of one second before taking any evasive action. So even though it determined that evasion was needed 1.3 seconds before collision (way too late), it did not attempt to do so until 0.3 seconds before collision (no chance to meaningfully reduce speed).
 

DanRoM

Forum Addict
Joined
Feb 27, 2009
Messages
8,067
Location
Ruhr Area, Germany
Car(s)
MX-5 ND, CBF1000 & two bikes
I also read articles about this and as a software engineer by trade, I am offended by the stupidity of this software. There is just no excuse for such a dumb implementation.
 