

It’s time to delete Uber from our cities

Uber’s self-driving program has never played by the rules, and now our safety is at risk

Cities need to say no to Uber’s self-driving program. | Uber

Uber’s self-driving program arrived in Arizona under odd circumstances.

About a week after the company’s self-driving pilot program launched on the streets of San Francisco in December 2016, the California Department of Motor Vehicles revoked Uber vehicles’ registrations because the company hadn’t obtained a $150 testing permit.

Instead of following California law, Uber loaded its 16 self-driving cars onto a flatbed trailer and drove them to Arizona. As they were en route, the state’s governor, Doug Ducey, issued a statement: “Arizona welcomes Uber self-driving cars with open arms and wide open roads,” it read. “While California puts the brakes on innovation and change with more bureaucracy and more regulation, Arizona is paving the way for new technology and new businesses.”

Update, March 26, 9:20 p.m. ET: Arizona has suspended Uber from autonomous vehicle testing. In California, Uber is letting its self-driving permit expire on March 31.

The flagrant flouting of California’s regulations, the dramatic, professionally photographed exodus—it all seemed less like the actions of a responsible global corporation and more like a bratty kid yanking away his toys after picking a fight.

“It’s not about picking a fight,” Anthony Levandowski, Uber’s self-driving program director, said at the time. “It’s about doing the right thing. And we believe that bringing this tech to California is the right thing to do.”

A year later, Levandowski himself was accused of not doing the right thing. Levandowski, who had worked at Uber competitor Waymo when its self-driving operation was still part of Google (it’s now part of Alphabet), was named in a lawsuit that accused him of trying to steal Waymo’s technology.

The case was settled within a week—Uber agreed to give Waymo’s parent company Alphabet about $245 million in equity. But even more importantly, as Recode reported, the settlement came with a guarantee for Waymo—“that Uber won’t use their self-driving tech.”

The technology that Waymo claims Uber was trying to steal is the technology that makes Waymo’s cars much safer than any other self-driving cars on the road:

One of the most powerful parts of our self-driving technology is our custom-built LIDAR — or “Light Detection and Ranging.” LIDAR works by bouncing millions of laser beams off surrounding objects and measuring how long it takes for the light to reflect, painting a 3D picture of the world. LIDAR is critical to detecting and measuring the shape, speed and movement of objects like cyclists, vehicles and pedestrians.
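The physics behind that description is a simple time-of-flight calculation: the sensor measures how long each laser pulse takes to bounce back and converts that delay into a distance. The sketch below is purely illustrative (it is not Waymo’s or Uber’s code); it shows the arithmetic for a single pulse, which a real lidar unit repeats millions of times per second to build the 3D picture Waymo describes.

    # Illustrative only: the time-of-flight arithmetic behind lidar ranging.
    # Not actual Waymo or Uber code.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458

    def range_from_time_of_flight(round_trip_seconds: float) -> float:
        """Distance in meters to the surface that reflected the pulse.

        The pulse travels out and back, so the one-way distance is half
        the round-trip path length.
        """
        return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

    # A return arriving about 200 nanoseconds after the pulse was fired
    # corresponds to an object roughly 30 meters away.
    print(round(range_from_time_of_flight(200e-9), 1))  # ~30.0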

Now, however, how well Uber’s own proprietary tech performed—specifically, whether it detected and measured the movement of a pedestrian walking a bike—is under intense scrutiny after one of Uber’s vehicles struck and killed Elaine Herzberg in Tempe, Arizona, on Sunday night.

Uber has poisoned the well for a nascent industry that has largely done the right thing up until now.

The National Transportation Safety Board and the National Highway Traffic Safety Administration are currently conducting a full investigation, and the results will likely take months. But based just on the basic information about the crash, including a dashcam video shared by Tempe police, most autonomous vehicle experts interviewed about the crash—by the Wall Street Journal, Wired, and the Arizona Republic—agree that Uber’s self-driving system failed.

Not only did the technology fail, but Uber is also responsible for the death, writes The Drive’s Alex Roy in a strongly worded and thorough examination of Uber’s culpability. “Even if you believe self-driving cars may someday reduce road fatalities—and I do believe that—this dashcam video is an icepick in the face of the argument that anyone at Uber gives a damn about anyone’s safety, including that of their own test drivers.”

The safety of Uber’s autonomous vehicle testing has been cited as a concern before.

Mere hours after Uber’s San Francisco self-driving trial began, The Verge reported that one of Uber’s cars ran a red light, nearly hitting a (human-driven) Lyft car. “Safety is our top priority,” commented a spokesperson, who told The Verge that the error was the fault of the human safety driver. Yet a New York Times investigation that looked at the vehicle’s internal logs found that Uber’s system failed to recognize the traffic light—as well as several others.

Uber’s vehicles were then accused of driving into San Francisco’s bike lanes without warning. This was not the fault of human drivers but a known software error, Uber told The Verge. Rather than fix the software in time for its public launch, though, Uber simply told its human safety drivers to take control of the vehicle when turning right on a street with a bike lane.

When a human driver has to take over from a self-driving car’s system, it’s called a “disengagement,” and it’s something autonomous vehicle companies treat as a last resort, since the handoff itself can be dangerous. California DMV records show that as self-driving programs log more on-road experience, they see fewer and fewer disengagements. Waymo, for example, now reports about one disengagement for every 5,600 miles driven.

Uber isn’t showing the same trend. In fact, the evidence points to an aggressive push to bring its technology to market when it clearly wasn’t ready.

For one, Uber has not publicly reported its own self-driving miles and disengagements. According to documents obtained by Recode last year, the vehicles are not driving enough to give the software the experience it needs, resulting in a troubling number of disengagements: over 20,354 miles of driving, human drivers had to take over roughly once every mile. The New York Times obtained newer documents showing that Uber was trying to reach a goal of 13 miles per disengagement in Arizona.

Uber says it’s logged millions of self-driving miles, but without that disengagement data made public, cities have no idea if the technology is getting safer.
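To put those figures side by side, the comparison comes down to simple division: miles driven per human takeover. The back-of-the-envelope sketch below uses only the numbers cited above and treats “taken over roughly once every mile” as one disengagement per mile; it is an illustration, not data from either company.

    # Back-of-the-envelope arithmetic using only the figures cited in this piece.
    # The per-mile Uber rate is an approximation of "taken over roughly once every mile."
    def miles_per_disengagement(miles_driven: float, disengagements: int) -> float:
        """Average miles between human takeovers; higher is better."""
        return miles_driven / disengagements

    waymo_rate = miles_per_disengagement(5_600, 1)       # ~5,600 miles per takeover
    uber_rate = miles_per_disengagement(20_354, 20_354)  # ~1 mile per takeover

    print(f"Waymo goes roughly {waymo_rate / uber_rate:,.0f}x farther between takeovers")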

There are five major companies that recently started testing in Arizona—Uber, Waymo, Ford, General Motors, and Intel—all of which are also on the road testing in other cities.

But of those companies, none come close to the experience logged by Waymo, which recently reached a milestone of five million self-driven miles. Waymo—which has been testing its technology on California streets since 2009 (with all the proper permits) and is now in 20 other cities—launched public trials in Arizona in 2017, choosing Chrysler Pacifica minivans over SUVs because its pilot project focuses on families and people with disabilities. Earlier this month, Waymo even began conducting fully autonomous testing in Arizona without a human safety driver at all.

Waymo has reported dozens of fender-benders involving its vehicles but only one at-fault collision: the car was going 2 mph and bumped into a bus. We know this because Waymo publishes monthly safety reports, including a comprehensive 43-page summary in 2017. Waymo also says its software has been explicitly programmed to recognize cyclists. One of the videos Waymo released (back when it was still part of Google) shows a vehicle detecting and stopping for a wrong-way cyclist.

In fact, Waymo built an entire mock city specifically to test interactions with humans who are not in vehicles. It includes regional roundabout designs frequented by cyclists and parallel-parking schemes meant to mimic shopping and entertainment districts where people are getting in and out of cars. There’s even a shed full of props—like tricycles—that people might use on streets, so that Waymo’s engineers can learn how to identify and avoid them.

Uber is probably doing some of those things, too. But the public largely doesn’t know about it. Uber is notoriously secretive about its operations—and especially about its self-driving division.

The list of Uber’s self-driving safety concerns doesn’t even include the many missteps of its ride-hailing business or the internal turmoil that has driven engineers out of the company.

And unlike other transportation network companies that have formed partnerships with cities, only last year did Uber begin to incrementally share its data.

Autonomous vehicle rules have yet to be developed at the federal level, but several transportation groups have published guidelines for how the technology should be safely deployed: the National Association of City Transportation Officials (NACTO) released a Blueprint for Autonomous Urbanism, and the Self-Driving Coalition for Safer Streets, of which Uber is a member, has its own policy recommendations, although the coalition has not commented on Uber’s fatal crash.

Unless more specific safety standards for testing are put in place, how will we know about Uber’s near misses? How many more incidents will there be in which Uber’s system fails to see a red light, or a bike, or a kid walking with his parents?

In the other cities where Uber’s autonomous program operates but is currently suspended (Pittsburgh, Toronto, and San Francisco, where it returned last May), officials may already know the answers to these questions. But we, the people using these streets, definitely don’t.

Autonomous vehicles hold tremendous promise for reducing traffic deaths, a public health crisis that cities are already working hard to address. Now Uber’s arrogance and blatant disregard for safety have set the industry back, and Uber needs to step aside so the technology can be properly developed by the dozens of companies that are acting in good faith.

Take the tiny autonomous minibus that scoots around downtown Las Vegas. The bus, developed by French companies Navya and Keolis, doesn’t even go faster than 15 mph because it travels in an area where the movements of pedestrians are prioritized. In fact, it’s so overly cautious that when a truck started backing up into it on its very first day of service, the bus just sat there while its front bumper got crunched. No one was hurt, engineers learned how to avoid this problem in the future, and the bus was back on its route two days later.

That’s the kind of do-no-harm autonomous vehicle testing that we should allow on our streets.

This story has been updated with an investigation by the New York Times.