Sister blog of Physicists of the Caribbean in which I babble about non-astronomy stuff, because everyone needs a hobby

Saturday, 24 March 2018

Uber's self-driving car crash

The firm that designed the sensors on the Uber self-driving car that killed a woman this week has said its technology was not to blame. San Jose-based Velodyne told the BBC it was "baffled" by the incident, adding its equipment was capable of seeing in the dark. Video of the incident was published by investigators earlier on Wednesday. It showed Ms Herzberg walking with her bicycle, away from a pedestrian crossing. Neither the car - nor its human driver - reacted.

Velodyne Lidar president Marta Hall told the BBC it would not be advising its customers to halt tests in the wake of the Arizona death because "we do not believe the accident was due to Lidar". Instead, the company is pointing to Uber's on-board computer as potentially being to blame, Ms Hall said.

"Our Lidar can see perfectly well in the dark, as well as it sees in daylight, producing millions of points of information. However, it is up to the rest of the system to interpret and use the data to make decisions. We do not know how the Uber system of decision-making works."
http://www.bbc.com/news/technology-43523286

8 comments:

  1. Well, I don't know the situation well, but in the attached picture it looks like a straight road with a person on it. Even a human driver in such circumstances should slow down and not hit that person.

    Take into account that she was crossing from left to right, so she definitely did not "jump straight" in front of the car's wheels. She had just walked across the whole left part of the road, so there was no surprise in her continuing on. It is hard to say, but I don't think she was running.

    So my preliminary opinion: there's something deeply wrong with this particular driving agent, no matter whether it was a human or a computer system, and it looks like the person operating this car is responsible for this accident.

    ReplyDelete
  2. Kazimierz Kurz The car's speed cannot be seen in the picture. If you watch the video version, she's hard to see until moments before the accident. I'd say a human driver would have had a chance to reduce the speed at which the collision occurred, but not to prevent it.

    I would have thought a self-driving car would be able to "see" the object earlier and prevent the accident.

    ReplyDelete
  3. This picture was clearly taken a fraction of a second before the hit.
    But as I said, the road seems to be straight there, so the decision to slow down should have been taken perhaps 100 meters earlier, I suppose. For a human driver that is obvious, if he is able to see a person on the road. To me it looks like the wrong decision was made.

    ReplyDelete
  4. Don't believe the dash cam video. The dash cam is either extremely poor quality, or the video was doctored, deceptively reducing brightness. Here are some videos showing the same road at night - notice how everything is lit well enough to drive.

    Also, in real life, human eyesight is able to see more from headlights than just that ridiculously short range view just in front of the headlights. Can you imagine driving at night with such poor illumination? You'd be crashing into things every night!

    The Uber system utterly failed, in what is not anywhere near any sort of edge case.

    The safety driver also was not paying attention, but we'll have to see whether this was a result of Uber expecting too much from the humans by overloading them with tasks that had previously been split between two crew members.

    arstechnica.com - Police chief said Uber victim “came from the shadows”—don’t believe it

    ReplyDelete
  5. If you want to get an idea of what typical lidar data from a Velodyne looks like, go here: http://www.cvlibs.net/datasets/kitti/ I've used this dataset for much of my research.
    cvlibs.net - The KITTI Vision Benchmark Suite
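    For anyone curious what that data actually is: a KITTI Velodyne scan is just a flat binary file of float32 values, four per lidar return (x, y, z in metres plus reflectance). A minimal Python sketch of loading one, using a synthetic file in place of a real scan such as the dataset's `000000.bin`:

    ```python
    import numpy as np

    def load_kitti_scan(path):
        # Each scan is a flat float32 array; reshape to one row per
        # lidar return with columns x, y, z, reflectance.
        scan = np.fromfile(path, dtype=np.float32)
        return scan.reshape(-1, 4)

    # Synthetic stand-in for a real KITTI scan file (illustration only).
    points = np.random.rand(1000, 4).astype(np.float32)
    points.tofile("fake_scan.bin")

    cloud = load_kitti_scan("fake_scan.bin")
    # Distance from the sensor to each return, ignoring reflectance.
    ranges = np.linalg.norm(cloud[:, :3], axis=1)
    print(cloud.shape)  # (1000, 4)
    ```

    A real highway scan has on the order of 100,000 such returns per sweep, which is the "millions of points" per second Velodyne's president refers to above; interpreting them is the perception system's job, not the sensor's.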

    ReplyDelete
  6. Michel Kangro Isaac Kuo has it right -- this was a critical failure by Uber, not a case where it was close. Nine out of ten human drivers would have taken action to try to mitigate the accident, and four or five out of ten would have avoided it entirely. I'm afraid that a failure of this magnitude happening to Uber really doesn't surprise me, as I've been following the details of the efforts of Tesla, Google, and Uber -- and Uber's object detection was clearly much worse than the others'. Below is a video of Tesla's object detection software in action (watch the middle view on the right, full screen) as an example of what good object detection looks like. My conclusions about Uber's level of object detection come from a video of the same type, but I'm afraid my searching hasn't rediscovered the link to share.

    If you want to take a look at the location using Google satellite view and Street View, here it is: https://www.google.co.uk/maps/place/Marquee+Theatre/@33.4369491,-111.9446801,550m/data=!3m1!1e3!4m5!3m4!1s0x872b0931ce8f494b:0x8512d47678881f8c!8m2!3d33.4369477!4d-111.9439989?hl=en. Ms Herzberg was moving from the odd x-shaped pavements to the south-east of the theater, heading north-northeast across North Mill Avenue.

    This critical failure by Uber shows that it's possible to do a really bad job of making a self-driving car -- but not much more.

    theverge.com - Tesla shows what its self-driving cars see while on the road

    ReplyDelete
  7. Michel Kangro other videos of that area at night paint a different picture of visibility, and the headlights on the car would have to be broken to provide so little illumination so close.

    I wouldn't drive a car with headlights that bad at night.

    ReplyDelete

