
Wednesday 13 July 2016

Self-driving cars should either be entirely self-driving or manual, not a bit of both

Tesla has admitted that its autopilot feature was activated when one of its cars crashed on Sunday. However, the electric carmaker has suggested that the function was not being used correctly at the time. The company has previously blogged that "customers using autopilot are statistically safer than those not using it at all".

Perhaps. But what exactly is the point of an auto-driving feature that still requires the driver to be engaged and alert and is only suitable for highways with centre dividers ? Isn't it inevitable that if you tell people they can take their hands off the wheel, they're going to become distracted ? Risk compensation is likely at work here : feeling safer makes you inclined to take more risks. However, to be fair to Tesla :

"This vehicle was being driven along an undivided mountain road shortly after midnight with autosteer enabled," a spokeswoman told the BBC, referring to autopilot's steering function. "This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated. As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so, and shortly thereafter the vehicle collided with a post on the edge of the roadway.

If they've built in an alert to tell drivers to take control, I don't see what more they can do. Google's approach of full self-driving may be more ambitious, but it's a better way to account for human psychology if people are going to insist on not following the instructions.
http://www.bbc.com/news/technology-36783345

13 comments:

  1. It's only a driving aid; the driver has the responsibility to be in control at the end of the day.

  2. My old man told me of when automatic transmissions were first installed in cars: his father said "Now every idiot will be on the road."

    Cruise control: same tiresome story, all the breathless reporting about accidents caused by these newfangled contraptions linked to the speed control.

    Now cometh the self-driving car. There's an old joke in my industry, so well known that robotics engineers will refer to their safety routines as either Pilots or Collie Dogs.

    For in the distant future, aircraft will be fully automated, mostly for insurance purposes, but also to reduce fuel consumption. But people will stoutly resist the idea of a robot aircraft. To humour such as these, the airlines will still have a pilot standing at the doorway, the usual avuncular figure we've come to know, with military flight experience. He won't be allowed to touch the controls except in cases of emergency. He'll go into the cockpit, close the door and nervous people will feel more at ease....

    But alongside all that automation, the airlines will commission the breeding and training of a dog. That dog will also be in the cockpit with the pilot, trained to bite him in the goolies should the pilot ever go to sleep.....

  3. But what exactly is the point of an auto-driving feature that still requires the driver to be engaged and alert and is only suitable for highways with centre dividers ?

    Highway driving is often incredibly dull in my experience, to the point of danger: it is easy to lose concentration or even fall asleep. Which is why it is strongly recommended here to stop every two hours for a break, and to take a nap as soon as you feel even slightly tired.
    An autopilot for highways only would still be pretty useful.

    Dan Weese _He won't be allowed to touch the controls except in cases of emergency._

    But that's precisely the moment for which you keep a human on board. Sometimes, something unexpected will happen that falls outside of the programming of the plane.
    Even if it happens once in ten million flights, that would be almost 4 crashes a year (there are on the order of 35-40 million commercial flights annually), and each time, people will ask why there wasn't someone in command.
    If your datalink is strong enough, you can afford to keep your pilots on the ground, but how reliable is that link?
    And then again, there are the inevitable cases where hackers will take a plane over.

    This is one of the main reasons a driver will still be needed in those cars, in fact. Given the so far abysmal security of connected cars, you want to keep a way to override the system from the inside.

  4. Some top end cars can replicate much of Tesla's autopilot. With lane keep assist, adaptive cruise control and auto braking you're not far off Tesla. It's widely accepted these are also just aids.

  5. Tesla does not tell you to take your hands off the wheel. Quite the opposite, Tesla tells you not to take your hands off the wheel!

    Tesla's only mistake, in my humble opinion, is the name they have chosen for their collection of driving assistants.

  6. Dan Weese The problem is that that time is already here - and we have evidence that crashes have occurred because humans, when suddenly called upon to take control, are so used to being passengers that they don't react correctly. This was, for instance, what happened to the Air France flight that went down in the Atlantic a few years ago.

  7. Robert Minchin In the beginning, all things are hard - so goes the Chinese proverb. Hi-tech just means this stuff doesn't quite work right, yet. That's my proverb. The technology is in its infancy. We'll have to come to terms with it, as we did with the automobile itself.

  8. Robert Minchin Do not forget that even today, the accidents caused by inattentive drivers being overwhelmed by suddenly having to take back control may be outweighed by the accidents prevented by "heavily assisted" driving compared to "pure human" driving.

  9. Michel Kangro It's a possibility, but the statistics from Tesla certainly don't demonstrate it as yet. In the airline industry, it is certainly the case that automation has made things safer, but has also introduced new ways for things to go wrong.
    http://www.slate.com/blogs/the_eye/2015/06/25/air_france_flight_447_and_the_safety_paradox_of_airline_automation_on_99.html

  10. Robert Minchin There will always be ways for things to go wrong. The idea has to be to eliminate the frequent and dangerous failure modes and replace them with less frequent and/or less dangerous ones.

    There have been fewer accidents per mile driven with Tesla's ill-named assistant than with pure human driving. There have almost certainly been fewer deaths per mile driven.

    A job well done.

  11. I can certainly see the point for developers in having this sort of intermediate-stage semi-auto pilot. But I'm still not convinced it's at all practical in the real world. I don't see how it makes things any easier for drivers if they still need to pay attention in order to take control in the event of an emergency - that just seems like it would make the process more boring and tiring, having to stay focused without having anything to actually do. If it is fully automatic in some circumstances, then drivers should be able to ignore what their car is doing, otherwise what's the point ? How does it make things easier to have your car do the incredibly tedious process of highway driving when it still requires your careful personal supervision ?

    The only way I can see this working is if it is explicitly designed to be fully automatic in some circumstances, e.g. highway driving, since if the car drives itself it's inevitable that people will pay less attention to their car in those circumstances.

  12. Why doesn't the feature adjust speed to fit with its confidence level?

    Nothing will get the driver to get over his fucking self faster than slowing down to 20 mph.
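
    A minimal sketch of what such a confidence-scaled speed policy might look like, in Python. Everything here is an assumption for illustration - the confidence score, thresholds, and speeds are invented, not anything Tesla actually exposes:

    ```python
    # Hypothetical sketch: scale the autopilot's target speed by its
    # perception confidence, dropping to an annoying crawl (and a takeover
    # alert) when confidence gets too low. Names and numbers are invented.
    MIN_SPEED_MPH = 20.0    # crawl speed that nudges the driver to take over
    MAX_SPEED_MPH = 70.0    # normal highway cruising speed
    CONFIDENCE_FLOOR = 0.5  # below this, demand a human takeover

    def target_speed(confidence: float) -> float:
        """Map a 0-1 perception-confidence score to a target speed."""
        if confidence < CONFIDENCE_FLOOR:
            return MIN_SPEED_MPH  # and alert the driver to take the wheel
        # Linear interpolation between crawl and cruise speeds.
        scale = (confidence - CONFIDENCE_FLOOR) / (1.0 - CONFIDENCE_FLOOR)
        return MIN_SPEED_MPH + scale * (MAX_SPEED_MPH - MIN_SPEED_MPH)

    print(target_speed(0.95))  # 65 mph: near full cruising speed
    print(target_speed(0.55))  # 25 mph: crawling until the driver takes over
    ```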

  13. Michel Kangro I don't dispute the concept, but Tesla's statistics are definitely dodgy – highway driving is safer than ordinary driving, so if (as Tesla says it should) the autopilot is only used on highways, you'd expect a lower number of accidents and deaths even if it was doing no better than an average driver. Plus the Autopilot is, even according to Tesla's stats, less safe than the average British driver – indicating a problem with US driver standards, rather than humans per se.
    As it stands, the technology is inferior to humans. I don't doubt that it will become superior one day, as it already has for airliners. But Tesla's attempts to claim that day has already arrived are dishonest and, in the worst case, deadly.
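
    A toy version of the selection effect described above, with made-up rates purely for the sake of the arithmetic (none of these are real accident statistics):

    ```python
    # Hypothetical fatal-accident rates per 100 million miles, invented
    # for illustration. The point: an autopilot restricted to highways can
    # beat the all-roads human average while still being worse than human
    # drivers on those same highways.
    HUMAN_ALL_ROADS = 1.2    # assumed human average across all road types
    HUMAN_HIGHWAY = 0.5      # assumed human average on divided highways only
    AUTOPILOT_HIGHWAY = 0.8  # assumed autopilot rate, highways only

    print(AUTOPILOT_HIGHWAY < HUMAN_ALL_ROADS)  # True: looks "safer than humans"
    print(AUTOPILOT_HIGHWAY < HUMAN_HIGHWAY)    # False: worse than the fair baseline
    ```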

