
Uber’s fatal crash: Are self-driving tests endangering pedestrians? [updated]


New video shows the moment of impact from two perspectives

An Uber self-driving car drives down 5th Street on March 28, 2017 in San Francisco, California.
Photo by Justin Sullivan/Getty Images

Uber has suspended all autonomous testing after its self-driving vehicle struck and killed a pedestrian in Tempe, Arizona, late Sunday night. The death raises new questions about who is liable for an autonomous crash—especially in this case, when a human “safety driver” was behind the wheel and, as a new video shows, was not looking at the road.

Although there has been at least one death attributed to Tesla’s Autopilot feature, the tragic death of 49-year-old Elaine Herzberg is considered the first time an autonomous vehicle has killed a person who was not a passenger in that vehicle.

A video released by Tempe police on Wednesday afternoon shows footage of the crash from two cameras in Uber’s vehicle. The dash camera shows a very dark, very wide street, with Herzberg walking her bike from the center median when she is struck. The perspective then shifts to the human safety driver, identified as Rafaela Vasquez, who is looking down and not at the street for much of the video.

The crash highlights potentially large gaps that still exist in federal AV policy, as well as unresolved technological challenges, such as how vehicles communicate with other users of the street.

According to a press conference held by Tempe police Monday afternoon, the self-driving 2017 Volvo XC90 SUV was traveling in autonomous mode and showed no signs of slowing down when it hit Herzberg, who was pushing a bicycle across an eight-lane street near the intersection of Mill Avenue and Curry Road. A preliminary investigation by Tempe police put the vehicle’s speed at 38 mph in a 35 mph zone, although the New York Times reported the speed as 40 mph in a 45 mph zone.

The safety driver was meant to take control of the vehicle in the event of an emergency. According to Sgt. Ronald Elcock of the Tempe Police Department, the driver was not impaired, and the county attorney will determine whether to bring charges against Uber, the driver, or both.

Arizona has long touted its permissive regulatory framework for autonomous vehicles, which has made it a hotbed for testing the technology. Several companies have recently launched high-profile pilot projects in the state, including Waymo’s driverless ride-hailing service.

In December 2016, the state’s Department of Transportation released a statement that touted the relative freedom of vehicle testing in Arizona due to the lack of state laws.

“Part of what makes Arizona an ideal place for Uber and other companies to test autonomous vehicle technology is that there are no special permits or licensing required,” the release said. “In Arizona, autonomous vehicles have the same registration requirements as any other vehicle, and nothing in state law prevents testing autonomous vehicles.”

Yet earlier this month, Arizona Governor Doug Ducey issued an executive order to try to bring more rules to the state’s AV testing regime. At the time of the order, 600 autonomous vehicles were operating in Arizona.

Ducey’s executive order also asked the Department of Public Safety (DPS) to issue a protocol around how law enforcement agencies handle fully autonomous vehicles—ones without a “safety driver”—in the event of a crash. According to a DPS spokesperson, the department is still working on a protocol for law enforcement interaction with autonomous vehicles, in partnership with the Arizona Department of Transportation, industry, and law enforcement stakeholders.

With such a high-tech vehicle involved in the incident, there will be plenty more data available to determine exactly what went wrong, why the car made an error, and, ideally, what can be fixed to prevent it from happening again. If the vehicle’s sensors did indeed fail, the data should include information beyond what humans are able to see in a video.

The National Transportation Safety Board (NTSB) has sent a small team of investigators to Arizona to gather information about the Uber crash. According to a press release, the investigation will examine the “vehicle’s interaction with the environment, other vehicles and vulnerable road users such as pedestrians and bicyclists.”

There may also be an opportunity to change the way self-driving cars interact with pedestrians. Carol Reiley, co-founder of Drive.ai, a company developing systems to help driverless cars share the streets with pedestrians and cyclists, told Curbed that car designers need to create “a car with a personality”—one that has the language to send sophisticated signals about intent, as well as recognize intent in others.

“Humans always break the rules,” said Reiley. “There’s a reason so many self-driving cars have been rear-ended on the road. They follow the rules too well, and don’t anticipate human behavior.”

Uber’s self-driving technology had already been noted to have trouble anticipating the movements of cyclists in San Francisco. When asked about yesterday’s Uber crash, Reiley had no comment.

In a statement about the crash released earlier today, National Association of City Transportation Officials (NACTO) executive director Linda Bailey echoed the sentiment that AV technology must be able to safely interact with all users of the street, no matter the conditions: “People on bikes, on foot, or exiting a parked car on the street, in or out of the crosswalk, at any time of day or night.”

Last year, NACTO released a Blueprint for Autonomous Urbanism, which encourages cities to use autonomous vehicles that travel no faster than 25 mph as a tool for making streets safer and more accessible. But NACTO also recommends that the vehicles be fully autonomous, with no opportunity for human intervention, an approach that has been deemed safer by several groups, like the Self-Driving Coalition for Safer Streets.

“Responsible companies should support a safety standard and call for others to meet one as well,” said Bailey. “We cannot afford for companies’ race-to-market to become a race-to-the-bottom for safety.”

This story was first published on March 18 and has been updated with new information.