jebjeb
I don't think it will ever be "non-beta". It is always gathering data and sending it back for analysis, and new conditions keep getting added.
Hmmm... how do I put that mildly?
No, there's no way to soften this: motorbikes will be the biggest risk to life and limb on the road if cars become autonomous. Given the general habit of speeding (and no, I am not exaggerating) and of driving recklessly in general, no computer will have the intuition to expect the unexpected. Autonomous driving only works when computers can react to defined parameters. If someone isn't keeping to the rules, no computer will be able to handle that.
And yes, I know, you are the ONE motorbiker on the planet who always sticks to the rules.
So the computer "assumed"... interesting
When the light turned green, several cars ahead of the bus passed the SUV. Google has said that both the car's software and the person in the driver's seat thought the bus would let the Lexus into the flow of traffic. The Google employee did not try to intervene before the crash.
"This is a classic example of the negotiation that's a normal part of driving -- we're all trying to predict each other's movements. In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision," Google wrote of the incident.
Have you actually read the article you linked?
From the article:
What I have read is that Google is actually trying to get their self-driving cars to behave a bit more like a human. Otherwise their cars get hit by inattentive humans who assume that the cars will behave like cars driven by humans.
I guess they would benefit from an L-plate, as in a learner plate.
The whole talk about "artificial intelligence" is bullshit as long as a computer can't evolve beyond its programming.
You people better be all over this shit like you are with FCA's electronic shifter.
Well, first of all: excusing the accident by saying that even a human couldn't have prevented it is a bit lame, when the whole point of the presentation above was to show that the car watches traffic BETTER than a human.
Secondly, a computer is only ever as good as the human who programmed it. And humans make mistakes, as you surely know. Computers are, despite all the technical progress of the past decades, incredibly stupid. They can all do exactly one thing: additions -- one at a time.
All the increases in computer capacity and speed over the last 60 years or so have only enabled computers to do those additions faster. And as long as nobody has gotten a quantum processor to work, computers will still be doing one addition at a time. The whole talk about "artificial intelligence" is bullshit as long as a computer can't evolve beyond its programming. Maybe one day they will be able to, but even then they will still break at some point. All machines break eventually.
So entrusting human lives in traffic to a machine might work to a certain extent, but at some point new variables that nobody could foresee will add new dangers and risks.
In essence we will all be the lab rats for computers that try to replace human intuition and experience, which of course they will never be able to do.
So entrusting human lives in traffic to a machine might work to a certain extent but at some point new variables that nobody could foresee will add new dangers and risks.
Over 30,000 people die on US roads every year, and the vast majority of those deaths come down to driver error. Humans suck at foreseeing new variables and responding.
This is way worse than FCA's shifter issues, IMO. Tesla's system is just a glorified radar cruise control plus lane-keeping assist (it doesn't hook into the navigation system to know where to go, and it doesn't even pay attention to speed limit signs), yet they hype it up by calling it "Autopilot" and talking at conferences about self-driving cars all the time. I really hope they're forced to change the name, if nothing else. It pretty much begs people to misuse it in a way that the FCA shifter doesn't.
So... it's not suited for Norway? XD
"Autosteer... is best suited for highways with a centre divider.
"We specifically advise against its use at high speeds on undivided roads."
So... it's not suited for roads? XD
Roads with dividers, that's fancy stuff, like they got in big cities.
So you can learn where it is having problems and fine tune the system, just like every other time a beta is released to the public.