How will driverless cars coexist with pedestrians?


New Silicon Valley startup Drive.ai seeks to create vehicle communication systems with personality, poise, and above all, the power to prevent accidents.

One of the main selling points of the government’s newly released policy proposals on driverless cars is how the rapidly advancing technology will make the roads safer and decrease pedestrian deaths. Last year, 35,200 Americans died on the nation’s roads; with 94 percent of those deaths resulting from human error or choice, robot drivers seem like a godsend in a nation where car usage continues to climb.

But even if we remove all the human drivers, automated vehicles still need to contend with pedestrians crossing our roadways, as well as cyclists and motorcyclists. Designing cars that not only communicate with each other but also provide navigation and safety cues to those sharing the street becomes paramount.

Drive.ai, a new Silicon Valley startup developing communication systems for driverless vehicles, focuses on improving human-robot interactions by making cars "learn" like their owners. Utilizing deep learning, an approach to artificial intelligence that would allow a vehicle to draw lessons from its previous experiences on the road, the company’s context-based learning will help cars quickly master navigation and communication issues that have bedeviled designers. This important human-robot interaction is just beginning to develop, and cars need to learn some manners.
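To make the contrast with hard-coded rules concrete, here is a deliberately toy sketch of "learning from logged road experience." Every feature name and action below is invented for illustration, and a real system like Drive.ai’s relies on deep learning rather than the simple nearest-neighbor lookup shown here; the point is only that decisions come from past encounters rather than a fixed rulebook.

```python
import math

# Hypothetical logged encounters: (pedestrian_distance_m, pedestrian_speed_mps,
# at_crosswalk) -> the action the car took in that situation.
experience_log = [
    ((12.0, 1.4, 1), "yield_and_signal"),
    ((30.0, 0.0, 0), "proceed"),
    ((8.0,  2.0, 1), "stop_and_signal"),
    ((25.0, 1.2, 0), "slow_down"),
]

def action_from_experience(context, log=experience_log):
    """Choose the action taken in the most similar previously logged situation."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(log, key=lambda entry: distance(entry[0], context))[1]

# A new situation is resolved by analogy to experience, not by a fixed rule.
print(action_from_experience((10.0, 1.5, 1)))  # -> "yield_and_signal"
```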

"If you hard-code the rules, that’s not the right strategy," says president and co-founder Carol Reiley. "Humans always break the rules. There’s a reason so many self-driving cars have been rear-ended on the road. They follow the rules too well, and don’t anticipate human behavior."

A vehicle with flashing lights and loud noises won’t necessarily solve the problem. The answer, says Reiley, is creating a car with a personality, one that has the language to send sophisticated signals about intent, as well as to recognize intent in others. The working Drive.ai concept, a series of sensors and software, plus an LED sign that would use audio signals and a pictorial language akin to emojis, accomplishes this task with a blend of verbal and nonverbal cues. And with deep learning, the system gets more sensitive and smart over time (other companies, such as Google, have previously focused on systems that hit the road with the complete rule book pre-installed).
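As a rough illustration of that concept (and nothing more: the situation names, pictograms, and sound cues below are invented for this sketch, not taken from Drive.ai), the outward-facing half of such a system amounts to translating whatever the car has decided to do into a cue humans can read at a glance:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cue:
    pictogram: str        # emoji-like image shown on the exterior LED sign
    sound: Optional[str]  # optional audio signal to accompany it

def choose_cue(situation: str) -> Cue:
    """Map the vehicle's current intent to a human-readable exterior cue."""
    table = {
        "yielding_to_pedestrian": Cue("walking figure + 'safe to cross'", "soft_chime"),
        "waiting_for_cyclist":    Cue("bicycle icon + 'I see you'",       None),
        "about_to_pull_out":      Cue("arrow + 'pulling out'",            "gentle_beep"),
    }
    # Default: simply announce that the vehicle is driving itself.
    return table.get(situation, Cue("robot icon + 'self-driving'", None))

print(choose_cue("waiting_for_cyclist"))
```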

The system, which is currently undergoing on-road tests with numerous vehicles, helps a car be Knight Rider on the inside and Herbie on the outside. And, most importantly, it won’t stray into the uncanny valley of communications, becoming so realistic that it’s off-putting. Personality doesn’t mean saying hello to a child crossing the street and scaring him or her.

"The number one thing is creating an intuitive and easy design that doesn’t require someone to be retrained to understand it," Reiley says. "That’s why emojis work so well. They may be hilarious, but there’s a reason they’ve become universal. They’re understandable pictures that quickly broadcast intent."

The challenge of crossing barriers and creating smart protocols is vast. On one hand, having cars use, say, a cute animal symbol to communicate with a child suggests profiling can have value; on the other, if the car mistook Reiley, an Asian-American, for a Chinese speaker and warned her in Mandarin, it would create a communications breakdown. To devise a language for human-robot interactions, the Drive.ai team has scoured an array of fields, such as autism research and the ways different members of the service industry stand out, to understand a full range of verbal and non-verbal means of getting a message across.

Urban interactions between pedestrians and automated cars present unique challenges for AI scientists. Many assume car-to-car technology, even for cars with human drivers, will make the roads much safer; we could have a roadway with no intersections or traffic lights, just smart cars coordinating and sharing the streets. But a future where automated cars have the road to themselves is a long way off. Until then, unless they can also signal to cyclists sharing the road, there will be problems.

"Right now, we have no transparency," says Reiley. "We want to make sure that if it’s an autonomous car, cyclists know that we know they’re there. If the vehicle can’t see the cyclists, you’re going to cause an issue and accident."

Reiley believes the technology will evolve in steps. Maybe a few years after widespread commercial adoption, cities will set up autonomous-only lanes to improve traffic flow and take advantage of computer-aided efficiencies. They might even find an interesting means of creating a protected bike lane: if cars were programmed to stay a certain distance from cyclists, that might make for an even safer roadway.

As automated vehicle tech evolves, our cities will undergo a parallel evolution, shedding unneeded infrastructure and creating additional space for pedestrians, potentially jumpstarting a renaissance in car-free zones. Seamless messaging, from cars to smartphones and other wearable devices, may create new safety protocols. But that just ups the ante when it comes to designing better warning systems. For the next generation of vehicles to function well in this environment, a certain degree of "emotional intelligence" and improved interaction needs to be programmed into the system.

"It’s about building trust and understanding how to interact," says Reiley. "It’s self-awareness, in the sense that these vehicles need to seem respectful and be able to engage with other cars as well as humans."