Self-driving Uber car that hit and killed woman did not recognize that pedestrians jaywalk

or the other way around?

Let the autopilot run in the background without doing anything; just constantly verify whether it would've done the same thing the driver did, and modify itself accordingly.

Absolutely *not*. Self-modifying AI code is terrifyingly bad.

What I think they should be doing for the time being is not have the cars drive themselves, but run in logging mode, so you could see what the car WOULD have done in a given situation, see what the driver actually DID, and then compare. And to test the software actually doing stuff, they could use a controlled environment with like animatronic mannequins or some shit.
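The "logging mode" idea above is sometimes called shadow mode: the planner computes an action every tick but never actuates anything, and you only log the moments where it disagrees with the human. A minimal sketch, with made-up action fields and tolerance thresholds purely for illustration:

```python
# Hypothetical "shadow mode" sketch: the AI plans but never actuates;
# we log every tick where its plan diverges from what the driver did.
from dataclasses import dataclass

@dataclass
class Action:
    steering: float  # radians, left negative (assumed convention)
    braking: float   # 0.0 (none) to 1.0 (full)

def disagreement(planned: Action, actual: Action,
                 steer_tol: float = 0.05, brake_tol: float = 0.2) -> bool:
    """True when the AI's planned action diverges from the driver's."""
    return (abs(planned.steering - actual.steering) > steer_tol
            or abs(planned.braking - actual.braking) > brake_tol)

def shadow_log(frames):
    """frames: iterable of (timestamp, planned Action, actual Action)."""
    log = []
    for t, planned, actual in frames:
        if disagreement(planned, actual):
            log.append((t, planned, actual))  # flag for human review later
    return log
```

The point of the design is that nothing in the loop ever touches the controls; the output is just a disagreement log for engineers to review, not training data fed straight back into the car.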

That's one idea. What I'd like to see before they get to the controlled real world environment is to interface it with a world-sim of the same general idea as ARMA3. Let it drive around that environment *first*... then invite players in to help test it with random encounters. In the meantime, sure, have it watch actual drivers and compare what the AI would do to what the drivers did.
 
It worked perfectly... it adapted to the environment it was trained in and started behaving in the same fashion. They just took a crap environment, and the PC police got on their hind feet, but the AI did what it was asked to do.

Yes. And what happens if the AI is paired with a bad/unsafe Uber driver? (They do exist.)

It's going to learn those bad habits. Quickly.
 
It worked perfectly... it adapted to the environment it was trained in and started behaving in the same fashion. They just took a crap environment, and the PC police got on their hind feet, but the AI did what it was asked to do.
In the case of driving, the problem would be losing predictability of results. At this stage we would want to know exactly how it would respond to every situation.
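One way to get that predictability is a fixed regression suite: pin every known scenario to one expected response, so any change in the planner's behavior is caught before it ships. A toy sketch, with a stand-in planner and invented scenario names:

```python
# Hypothetical regression suite: every known scenario maps to exactly one
# expected action, so a self-modified or retrained planner that answers
# differently fails the suite instead of surprising us on the road.

def plan(scenario: dict) -> str:
    """Stand-in planner: brake for any obstacle in the lane, else cruise."""
    if scenario.get("obstacle_in_lane"):
        return "brake"
    return "cruise"

# Scenario name -> (inputs, the one response we require).
SCENARIOS = {
    "empty_road":      ({"obstacle_in_lane": False}, "cruise"),
    "jaywalker_ahead": ({"obstacle_in_lane": True},  "brake"),
}

def run_suite():
    """Return a list of (name, expected, got) for every failing scenario."""
    failures = []
    for name, (scenario, expected) in SCENARIOS.items():
        got = plan(scenario)
        if got != expected:
            failures.append((name, expected, got))
    return failures
```

This is the opposite of "modify itself accordingly": the expected answers are frozen by people, and the software has to keep matching them.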
 