UBER HALTS ALL TESTING AND PULLS ALL AUTONOMOUS VEHICLES OFF OF THE STREETS
Sunday evening a pedestrian, identified as 49-year-old Elaine Herzberg of Tempe, Arizona, was struck and killed by an autonomous vehicle belonging to transportation giant Uber. This marks the first fatal accident involving a self-driving vehicle and a pedestrian in the United States.
Tempe police say that the self-driving car was in autonomous mode at the time of the crash and that the vehicle hit the woman, who was walking outside the crosswalk. The victim reportedly died at a local hospital as a direct result of injuries sustained in the collision.
Uber’s CEO Dara Khosrowshahi said in a statement that “We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.” The National Transportation Safety Board is expected to launch an investigation into the incident.
The car was one of a fleet of autonomous vehicles that the transportation giant has been testing in Arizona, Pittsburgh, and Toronto. The vehicles, which are still in a testing phase, do have a driver behind the wheel as a safety precaution; the car, however, was engaged in self-driving mode at the time of the incident.
Uber said it was pausing its self-driving car operations in Phoenix, Pittsburgh, San Francisco, and Toronto, pulling all of its autonomous vehicles off the street and halting all testing for the time being.
Self-driving vehicles have been involved in accidents before, but no one had been injured and the car had never been at fault.
Tesla Motors was the first to report a death involving a self-driving car in 2016 when the sensors of a Model S driving on autopilot failed to identify an 18-wheel tractor-trailer crossing the highway. The car drove underneath the trailer at full speed, killing the 40-year-old behind the wheel of the Tesla.
In one recent incident, California police officers discovered a Tesla vehicle that was stopped in the middle of a five-lane highway. The driver was asleep behind the wheel. The driver told authorities that the vehicle was on “auto-pilot,” before he was arrested on suspicion of driving under the influence.
In another recent case, an autonomous Tesla vehicle rear-ended a fire truck on a freeway. Again the driver told authorities that the car was in autopilot mode at the time of the collision.
John M. Simpson, privacy and technology project director with Consumer Watchdog, said the collision highlighted the need for tighter regulation of the fledgling technology.
Simpson said he was unaware of any previous fatal crashes involving an autonomous vehicle and a pedestrian. “The technology is not ready for it yet, and this just sadly proves it,” Simpson said, adding, “The robot cars cannot accurately predict human behavior, and the real problem comes in the interaction between humans and the robot vehicles.”