
Autonomous Cars: Not Yet Ready for Prime Time

These are not autonomous vehicles by any meaningful definition.

The industry defines six levels of autonomy, from Level 0 (none) to Level 5 (can handle anything a human driver can). Systems like Tesla's "autopilot" (and similar offerings from a couple of other manufacturers) are Level 2 on this scale, which is basically "glorified cruise control": the car can handle steering and pedals simultaneously, but the driver must monitor continually and be ready to take control at any moment. In other words, there are no conditions under which safety-critical functions can be wholly delegated to the vehicle at this level of autonomy (if that is possible under some conditions, you have reached Level 3).
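For reference, the SAE J3016 scale being described maps out roughly like this (a sketch in Python; the level numbers are SAE's, the one-line glosses are my paraphrase):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels (glosses are paraphrased)."""
    NO_AUTOMATION = 0       # human does all the driving
    DRIVER_ASSISTANCE = 1   # steering OR speed assisted, never both at once
    PARTIAL = 2             # steering AND speed assisted, but the driver must
                            # monitor continuously ("glorified cruise control")
    CONDITIONAL = 3         # car drives itself in some conditions; driver
                            # must take over when asked
    HIGH = 4                # whole trips with no human fallback, within a
                            # defined operating domain
    FULL = 5                # can handle anything a human driver can

def driver_must_supervise(level: SAELevel) -> bool:
    """At Level 2 and below, safety-critical monitoring stays with the human."""
    return level <= SAELevel.PARTIAL
```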

The problem is that at least some people who own such vehicles seem incapable of accepting just how limited they are. Some have suggested that marketing names like "autopilot" don't help people appreciate the limits, and certainly there's a "tension" between what companies say when things go wrong and the impression their marketing gives of the system's capabilities.

So should systems at this level be banned? I think there are two possible arguments. Firstly, this limited level of competence will tend to promote driver disengagement even though constant attention is still required, and even if drivers intellectually understand the limitations. Secondly, a sufficiently large number of people will not be able to judge the system's competence accurately: they'll watch it handle a few simple situations fine and will overestimate its capabilities, however carefully the limitations are explained, because their experience is limited to what they see it do and they have no way of appreciating the many ways its limitations can manifest.

Personally I don't think I'd want anything less than Level 4 (capable of managing an entire trip fully autonomously), so I'm quite open to systems like this being taken off the road. Though if we can find a way of removing distracted/angry/impatient/macho drivers, I'd still give that a higher priority ;) (though the two are not mutually exclusive).
 
I saw this on the BBC and thought you should see it:

This car is on Autopilot. What happens next? - http://www.bbc.co.uk/news/business-44460980

I rest my valise.

To defend, attempt to mitigate, or apportion blame is, well... disgusting.

These should be banned for the foreseeable future, imv.

Wow, that's scary. No, I certainly wouldn't trust that car. It's basically just a glorified cruise control.
 

Lol! I used the same term as you, without first reading your response here. "Glorified cruise control" is right, and I suspect a lot of drivers do over-estimate what a car such as this can do. From all the marketing hype and publicity I've seen around Tesla, my impression was that the cars were a lot more capable than shown in this demo.
 
I think we should abandon the ground altogether, and go straight to fully autonomous flying vehicles. That's the future.

So you wanna get all these people who already can't drive to pilot vehicles that, when they run into each other, will fall out of the sky and kill each driver as well as any unlucky sod(s) underneath?! We can't go autonomous without humans first experimenting on live subjects to show it's safe for the AIs :rolleyes:
 

People wouldn't be piloting anything. Take them out of the equation, as some clearly can't be trusted to drive cars sensibly.
This wouldn't be happening overnight. Obviously the hardware has to be developed. And in time the software will be up to the task.
 
It should be easier to avoid collisions when you have 3 dimensions to work in. It's taking off and landing, where things get bunched together, that are riskiest.
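To make the "three dimensions" point concrete, here's a minimal closest-point-of-approach calculation for two vehicles on straight-line tracks (standard kinematics, not any real traffic-management system): with a vertical axis available, tracks that would conflict on the ground can be kept apart by altitude alone.

```python
import numpy as np

def closest_approach(p1, v1, p2, v2):
    """Minimum future separation between two constant-velocity vehicles.

    p1, p2: 3D positions (m); v1, v2: 3D velocities (m/s).
    Returns (min_distance_m, time_s).
    """
    dp = np.subtract(p1, p2).astype(float)
    dv = np.subtract(v1, v2).astype(float)
    dv2 = dv @ dv
    # If relative velocity is ~zero, separation never changes; otherwise take
    # the time that minimises |dp + t*dv|, clamped to "now or later".
    t = 0.0 if dv2 < 1e-9 else max(0.0, -(dp @ dv) / dv2)
    return float(np.linalg.norm(dp + t * dv)), t

# Two tracks that cross the same ground point head-on, but with 50 m of
# altitude between them: the vertical axis alone keeps them separated.
dist, t = closest_approach([0, 0, 0], [50, 0, 0], [1000, 0, 50], [-50, 0, 0])
print(f"closest approach: {dist:.1f} m at t = {t:.1f} s")  # 50.0 m at t = 10.0 s
```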

But for sure keep humans out of the loop for takeoff and landing. We know they wouldn't have the discipline to operate flying cars safely. Even the movies know this: either there are very few flying cars (Blade Runner) or they (mostly) fly in well-defined lanes (Star Wars).

(Curious: Gboard was very slow to predict "Wars" following "Star". And failed to predict "Gboard". SwiftKey definitely does that better...)
 
I know this is an old thread, but there were some recent developments.

What do you know, it wasn't programmed to recognize jaywalkers. :rolleyes:

https://www.engadget.com/2019/11/06...b5KjXYA8F0lyU1f-acLZo6SPKyAbj5KF&guccounter=2
The report states: “Although the [system] detected the pedestrian nearly six seconds before impact … it never classified her as a pedestrian, because she was crossing at a location without a crosswalk [and] the system design did not include a consideration for jaywalking pedestrians.”
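To spell out what that finding means: the detection itself worked, but the classifier effectively gated the "pedestrian" label on crosswalk context, so pedestrian-style path prediction never kicked in. A toy reconstruction of that failure mode, based only on the report's wording (not Uber's actual code):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    looks_like_person: bool
    near_crosswalk: bool

def classify_as_reported(d: Detection) -> str:
    # The failure mode the report describes: "pedestrian" was only a
    # possibility in crosswalk contexts, so a jaywalker was shuffled
    # between other classes and never got pedestrian path prediction.
    if d.looks_like_person and d.near_crosswalk:
        return "pedestrian"
    return "other object"

def classify_sanely(d: Detection) -> str:
    # Classify by what the object is, not by where it happens to be.
    return "pedestrian" if d.looks_like_person else "other object"

jaywalker = Detection(looks_like_person=True, near_crosswalk=False)
print(classify_as_reported(jaywalker))  # -> "other object", ~6 s before impact
print(classify_sanely(jaywalker))       # -> "pedestrian"
```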
 
I'm sure crosswalks are easy to detect, and so only flagging pedestrians when they're on one will be a simple programming problem.

But everyone involved in authorising that, in authorising a vehicle with that programming to travel on the road, and in not informing the safety driver of this limitation (because I'm sure they didn't), should be sacked and face charges. That's unforgivable (though if you'd asked me two years ago to guess which company would cut a corner like that, Uber would have been top of my list).
 
I look at it as an inherent risk in trying to program human thought into a machine. The programmers have to anticipate everything a human driver would think of (and, scarier still, the things they wouldn't) in every scenario, and adequately encode all of that into the machine.

And I'll circle back to saying the biggest impediment to self-driving cars becoming mainstream will be lawyers. Was it incompetence or negligence? Whose incompetence or negligence? Was it really a foreseeable event? Was the critical failure with the vehicle manufacturer (Volvo), the operator (Uber), the safety backstop (the human), or any one of the many sensor manufacturers, developers, and whatnot? There are more than enough ambulance chasers out there to make self-driving cars so cost-prohibitive up and down the supply chain that it just won't work.
 
