
Autonomous Cars: Not Yet Ready for Prime Time

Discussion in 'Automotive' started by rootabaga, Mar 19, 2018.

  1. dan330

    dan330 Extreme Android User
    Wow... it's interesting, the bias and fear that AI causes. Unreasonable fear, in this case of cars with AI.

    AI in cars will always be better, on the whole, than humans.

    Even with AH drivers being aggressive on the road, AI cars will react better and faster, letting the AH car drive away: no fuss, no muss. Humans will react unfavorably and unsafely against the AH car.

    This morning a car cut me off, and I followed him closely, tailing him for a while to mess with him. That was a no-no and could have escalated. But an AI would have just ignored it.
     


  2. Unforgiven

    Unforgiven ...eschew obfuscation...
    Moderator
    Unless the sensor didn't detect the car cutting it off; or it did, but the programming didn't recognize the threat from the sensor data and didn't tell the steering, braking, and accelerator actuators to react; or it told them to react but the calculations were wrong; or the actuators didn't do what the programming expected them to do; or the road conditions (e.g. sand, ice, heat, new asphalt, dirt road) didn't perform the way the computer expected them to; or the other car made an unanticipated (unprogrammed) adjustment. And once the automated car reacts, all of these sensors and systems need to reprocess all of the information as the corrective actions take effect.

    Getting back to my "lawyers will be the problem" statement: unless one entity is responsible for all of those systems, the lawyers will sue everyone else. It will be a big legal finger-pointing exercise. Look at the case that started this thread: the company that made the safety systems for Volvo blamed Uber for turning them off.
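    The chain of dependencies described above (sense, classify, plan, actuate, then re-sense) can be sketched as a toy control step. Everything here is illustrative: the `Detection` fields, `plan_reaction` name, and the 2-second time-to-collision threshold are my assumptions, not any real vehicle stack.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Detection:
        # Hypothetical sensor output for one nearby object
        distance_m: float          # gap to the object
        closing_speed_mps: float   # positive = object getting closer

    def plan_reaction(det: Detection, ttc_threshold_s: float = 2.0):
        """Return a braking command if the object is a threat, else None.
        Every stage (detection, threat classification, the calculation,
        and the actuator that receives the command) is a separate point
        of failure, which is the argument in the post above."""
        if det.closing_speed_mps <= 0:
            return None  # object holding distance or moving away
        ttc = det.distance_m / det.closing_speed_mps  # time to collision
        if ttc < ttc_threshold_s:
            # Brake harder the shorter the time to collision, capped at full
            return {"brake": min(1.0, ttc_threshold_s / ttc - 1.0)}
        return None
    ```

    Even this toy assumes the sensor reading is right and the actuator obeys; the post's point is that each of those assumptions can fail independently.
    
    
    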
     
  3. doubletop

    doubletop Lurker
  4. rootabaga

    rootabaga Extreme Android User
    Thread Starter
  5. Member2035261

    I saw this on the BBC and thought you should see it:

    This car is on Autopilot. What happens next? - http://www.bbc.co.uk/news/business-44460980

    I rest my valise.

    To defend, attempt to mitigate, or apportion blame is, well... disgusting.

    These should be banned for the foreseeable future, imv.
     
  6. Hadron

    Hadron Smoke me a kipper...
    VIP Member
    These are not autonomous vehicles by any meaningful definition.

    The industry defines 6 levels of autonomy, from level 0 (none) to level 5 (can handle anything a human driver can). Systems like Tesla's "autopilot" (and similar from a couple other manufacturers) are level 2 on this scale, which is basically "glorified cruise control" (driver can disengage from wheel and pedals simultaneously, but must monitor continually and be ready to take control at any moment). In other words there are no conditions under which safety critical functions can be wholly delegated to the vehicle at this level of autonomy (if that's possible in some conditions you reach level 3).
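    The six levels described above can be captured in a small sketch. The enum names are my paraphrase of the commonly cited SAE J3016 level names, and the monitoring rule is just the post's point restated; none of this is an official API.

    ```python
    from enum import IntEnum

    class SAELevel(IntEnum):
        """Driving-automation levels as summarized in the post above."""
        NO_AUTOMATION = 0           # human does everything
        DRIVER_ASSISTANCE = 1       # steering OR speed assistance
        PARTIAL_AUTOMATION = 2      # steering AND speed ("glorified cruise control")
        CONDITIONAL_AUTOMATION = 3  # car drives in some conditions, driver on standby
        HIGH_AUTOMATION = 4         # can manage an entire trip within a defined domain
        FULL_AUTOMATION = 5         # handles anything a human driver can

    def driver_must_monitor(level: SAELevel) -> bool:
        # At level 2 and below the human must watch the road continually
        # and be ready to take control at any moment.
        return level <= SAELevel.PARTIAL_AUTOMATION
    ```

    By this sketch, Tesla's "autopilot" sits at `PARTIAL_AUTOMATION`, where safety-critical functions are never wholly delegated to the vehicle.
    
    
    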

    The problem is that at least some people owning such vehicles seem incapable of accepting just how limited they are. Some have suggested that marketing names like "autopilot" don't help people appreciate the limits, and certainly there's a "tension" between what companies say when things go wrong and the impression that their marketing gives of the system's capabilities.

    So should systems at this level be banned? I think there are two possible arguments. Firstly that this limited level of competence will tend to promote disengagement of the driver despite their constant attention actually still being required, even if they intellectually understand the limitations. And secondly that a sufficiently large number of people will not be capable of judging the system's competence accurately: they'll see it handle a few simple circumstances fine and will overestimate its capabilities, however the limitations are explained to them, because their experience is the things they see it do and they've no way of appreciating the many ways in which its limitations can manifest.

    Personally I don't think I'd want anything less than Level 4 (capable of managing an entire trip fully autonomously), so I'm quite open to systems like this being taken off the road. Though if we can find a way of removing distracted/angry/impatient/macho drivers I'd still give that a higher priority ;) (though the two are not exclusive).
     
  7. LV426

    LV426 I say we take off and nuke this place from orbit
    Recognized Developer
    Wow that's scary. No I certainly wouldn't trust that car. It's basically just a glorified cruise control.
     
  8. LV426

    LV426 I say we take off and nuke this place from orbit
    Recognized Developer
    Lol! I used the same term as you, without first reading your response here. "Glorified cruise control" is right, and I suspect a lot of drivers do over-estimate what a car such as this can do. From all the marketing hype and publicity I've seen around Tesla, my impression was that the cars were a lot more capable than shown in this demo.
     
  9. Davdi

    Davdi Android Expert
  10. LV426

    LV426 I say we take off and nuke this place from orbit
    Recognized Developer
    I think we should abandon the ground altogether, and go straight to fully autonomous flying vehicles. That's the future.
     
  11. Zigman66

    Zigman66 Android Enthusiast
    So you wanna get all these people who already can't drive to pilot vehicles that, once they again run into each other, will fall to the ground and kill each driver, as well as the unlucky sod(s) underneath, when it happens?! Can't go autonomous without humans first experimenting with live subjects to show it's safe for the AIs :rolleyes:
     
  12. LV426

    LV426 I say we take off and nuke this place from orbit
    Recognized Developer
    People wouldn't be piloting anything. Take them out of the equation, as some clearly can't be trusted to drive cars sensibly.
    This wouldn't happen overnight. Obviously the hardware has to be developed, and in time the software will be up to the task.
     
  13. Hadron

    Hadron Smoke me a kipper...
    VIP Member
    It should be easier to avoid collisions when you have 3 dimensions to work in. It's taking off and landing, where things get bunched together, that are riskiest.

    But for sure keep humans out of the loop for that. We know they wouldn't have the discipline to operate flying cars safely. Even movies know this: either there are very few flying cars (Blade Runner) or they (mostly) fly in well-defined lanes (Star Wars).

    (Curious: Gboard was very slow to predict "Wars" following "Star". And failed to predict "Gboard". SwiftKey definitely does that better...)
     
  14. Unforgiven

    Unforgiven ...eschew obfuscation...
    Moderator
    I know this is an old thread, but there were some recent developments.

    What do you know, it wasn't programmed to recognize jaywalkers. :rolleyes:

    https://www.engadget.com/2019/11/06...b5KjXYA8F0lyU1f-acLZo6SPKyAbj5KF&guccounter=2
    The report states: “Although the [system] detected the pedestrian nearly six seconds before impact … it never classified her as a pedestrian, because she was crossing at a location without a crosswalk [and] the system design did not include a consideration for jaywalking pedestrians.”
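    As a toy illustration (not Uber's actual code) of the design flaw that quote describes: if the classifier only labels a detected person as a pedestrian when they are inside a crosswalk, the pedestrian-avoidance logic never engages for a jaywalker, even though the sensors detected her seconds before impact. The function name and labels here are hypothetical.

    ```python
    def classify(obj_type: str, in_crosswalk: bool) -> str:
        """Label a detected object. The flaw: 'pedestrian' (which triggers
        avoidance behavior) is only assigned when the person is inside a
        crosswalk, so a jaywalker stays 'unknown' and is never acted on,
        despite having been detected."""
        if obj_type == "person" and in_crosswalk:
            return "pedestrian"  # avoidance logic engages
        return "unknown"         # detected, but no consideration for jaywalking
    ```

    The sensor "saw" her either way; the classification rule is what discarded the information.
    
    
    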
     
  15. Hadron

    Hadron Smoke me a kipper...
    VIP Member
    I'm sure that crosswalks are easy to detect, and so only flagging pedestrians when they are on those will be a simple programming problem.

    But everyone involved in authorising that, and in authorising a vehicle with that programming to travel on the road, and who didn't inform the safety driver of this limitation (because I'm sure they didn't), should be sacked and face charges. That's unforgivable (though if you'd asked me two years ago to guess which company would have cut a corner like that, Uber would have been top of my list).
     
  16. Unforgiven

    Unforgiven ...eschew obfuscation...
    Moderator
    I look at it as an inherent risk in trying to program human thought into a machine. Humans have to think of everything they think of (and the stuff they don't think of, which is scarier) in every scenario and adequately program it into some machine.

    And I'll circle back to saying the biggest impediment to self-driving cars becoming mainstream will be lawyers. Was it incompetence or negligence? Whose incompetence or negligence? Was it really a foreseeable event? Was the critical failure the vehicle manufacturer (Volvo), the operator (Uber), the safety backstop (the human), or any one of the many sensor manufacturers, developers, and whatnot? There are more than enough ambulance chasers out there to make self-driving cars so cost-prohibitive up and down the supply chain that it just won't work.
     
