
The Self-Driving Startup Teaching Cars to Talk

Horns honk. Hands wave. Lights flash. Fingers fly and eyes meet. This orchestra may seem a mess to anyone stuck in the pit at rush hour, but for the most part, it works. Humans may not excel as drivers when it comes to paying attention or keeping calm, but we’re masters of communication, even when stuck in our metal boxes.

Robots offer this résumé in reverse: all-stars when it comes to defeating distraction, noobs when it comes to negotiating the human-filled environment. And for the folks aiming to deploy fleets of self-driving cars into that chaos, this is a problem.

“The question is how to replace the driver,” says Bijit Halder, the product and design lead for Drive.ai. The Silicon Valley-based startup just started a shuttle service in Frisco, Texas, connecting an office park to a nearby stadium and apartment complex. (It keeps a human in the driver’s seat, ready to take control if the robot falters.) That pilot project is the product of three years of development work by the company, which was founded by a group that came out of Stanford’s Artificial Intelligence Lab and now has more than 150 employees.

Drive.ai is counting on AI and machine learning expertise to teach its robots to drive, but from the start, it has focused on teaching them to communicate too. In places like Frisco, a suburb of Dallas, this technology is novel and (especially after Uber’s deadly crash in Arizona in March) can make people nervous. A successful service, the company believes, relies on making its customers, as well as everyone else outside the vehicle, confident in how it will behave. Confidence begets comfort. And confidence comes from communication.

So while the roboticists were writing code and running simulations, Halder and his team were poring over every detail of the Nissan NV200 vans Drive.ai runs in Frisco (the same model used for many New York City taxis, minus the glass partition) to make them easy to understand.

The bulk of the van is bright orange, making it easy to spot—the same thinking behind yellow school buses and red Ferraris. On the left and right sides, “Self-Driving Vehicle” is written in white over a ribbon of blue that pops against the orange, high enough that it’s easily spotted from inside a car. (The team went with “self-driving” over the techier “autonomous,” Halder says, because it’s the simpler term.) On the front of the car, it’s written on the bumper, low to the ground, where pedestrians crossing in front of the stopped vehicle are likely to look, to see if the wheels are starting to move. “We want to be cognizant of the context in which you see the car, and be responsive to it,” Halder says.

If that context includes you riding inside the thing, you get a 13-inch screen showing you the view from the car’s cameras, as well as what its lidar laser sensor sees. Drive.ai’s setup includes a thick red line showing the car’s planned trajectory for the next six seconds, an easy way to reassure a passenger the car won’t miss their turn, or that yes, it plans on stopping at that upcoming red light.

Drive.ai isn’t the only self-driving outfit paying attention to this sort of design, of course. Virtually every company in the space uses interior screens to communicate with their passengers. What sets Drive.ai’s approach apart is its use of screens on the van’s exterior. These four panels—each 22.5 by 7.5 inches, on the hood, on the rear, and just above each of the front wheels—are the vehicle’s voice. If the car comes to a stop to yield to a wary pedestrian, they flash “Waiting for You,” alongside a graphic of a person in a crosswalk. For any drivers behind the vehicle who might wonder what the holdup is, the rear panel reads, “Pedestrian Crossing.” When a Drive.ai employee is working the wheel, the panels say “Person Driving.”

These messages are just the latest iterations in an ongoing churn, as Halder’s team tries one idea after another. They’ve mixed and matched colors, played with animation and still images, and tested different turns of phrase. Back in May, the pedestrian panel read, “Waiting for You to Cross,” with a small image of a person walking. In June, the team tried the terse “Waiting,” with a larger image, before arriving at the current “Waiting for You.”

“‘Waiting’ is not as clear,” Halder says. “If you talk to me, I respond better. It’s about communicating.” Thus, “Waiting for You.” In the same time frame, the human driving mode sign went from “Self-Driving Off” with an exclamation point in a yellow triangle, to “Human Driver” with a steering wheel icon, to “Person Driving” with a cartoon chauffeur.

The panels themselves have changed, too. When Drive.ai first showed off the concept in 2016, it slapped a single billboard-style display on the roof, slightly smaller than the panels it uses now. After realizing that many people missed messages mounted that high, and that people on different sides of the car need different information, it moved to the current quartet in 2017.

The selecting force in this evolution is the user testing Drive.ai does, largely with focus groups, gauging who understands what and asking for opinions. The team shows participants renderings of early-stage ideas and observes their behavior around the vehicle. “We have a lot of opinions and ideas. They are not worth anything unless the user says so,” Halder says. “Most of our design is driven by user feedback.”

Riders in Frisco are contributing their opinions too. One, Halder says, compared it to an amusement park ride (as in cool and exciting, not vomit-inducing). And when the designer asked passengers what else they’d like from the experience, a few requested music. The team hasn’t added that in just yet, but Halder is thrilled at the request. “They’re saying, ‘Hey, I’m comfortable. Now entertain me.’”
