Many questions arise when one considers the concept of the self-driving Uber, but none is more important than “Is it safe?” As the details of a fatal accident from March involving one of the self-driving rideshare vehicles are released, it’s clear that there are still dangerous kinks in the software.
The March incident, which unfolded in Tempe, AZ, left one pedestrian dead despite the fact that the software recognized her. New details, including video footage of the crash, indicate that the software had no trouble identifying 49-year-old Elaine Herzberg, but it still failed to act accordingly and stop in time.
Uber executives point to the vehicle’s design, which prevents it from stopping for “insignificant hazards” like plastic bags. Allegedly, the software wasn’t tuned properly to recognize a full-sized human being as a hazard worth stopping for. Complicating matters was the vehicle’s backup driver, Rafaela Vasquez, who, as the video shows, was distracted by her phone. Whether she will face repercussions has yet to be determined.
#uber is attempting some legal footwork here: calling the very thing they are in charge of "tuning" and not the more recognizable "training"; also blame the "watcher" person who was supposed to… Not act like they were in a self-driving car???https://t.co/DC22l8rCQD
— Bombadil Brin (@bjorn_daporn) May 9, 2018
I’m still stuck on the self driving Uber vehicles. One killed a woman in Phoenix, and their working on improving it… tell me again why we need self driving vehicles?
— ashley (@your_520coke) May 9, 2018
Herzberg was struck by the vehicle, which was traveling 38 mph in a 35 mph zone and did not slow down or swerve. Initial reports considered whether a technical failure had kept the software from responding, but information released on Monday, May 7, indicated that the vehicle had acted according to its programming.
Wow. Tradeoffs in software configuration. Uber’s self-driving car saw the woman crossing the road, but then decided it was a false positive and ran into her. https://t.co/gScESefMJv
— Nicholas Thompson (@nxthompson) May 8, 2018
A self-driving Uber that killed a woman in Arizona recognized her as a pedestrian but didn’t attempt to avoid her because of a software setting. The software was set on "Hot or Not " a setting for airport pick up only. Uber said it was ugly for everyone involved.
— Matt Becker (@houdini357) May 8, 2018
I don't want to hear Uber and self driving in the same sentence. Get over it @Uber, it doesn't work. There's been too many accidents already.
— Key-Key Marie (@KEYspitz) May 9, 2018
Another report will soon follow from the National Transportation Safety Board, which sent its own team to Tempe, AZ to investigate Herzberg’s death. When pressed for comment, an Uber spokeswoman referred media outlets to the NTSB and followed up with an emailed statement saying that former NTSB Chairman Christopher Hart had been hired to review the company’s safety procedures.
Immediately following the accident, Uber suspended its self-driving operations.
#Uber is working to make a flying car by 2020…..the self driving car didn't work……so I know I won't be in a flying car.
— Dap!!! (@Deetroit_Dave) May 9, 2018
How about just pay people to drive. Spend trillions on self driving cars when you can just pay people. Elites are beyond greedy.
— 4409 (@Tweet4409) May 8, 2018