
After the Fatal Crash, Uber Revamps Its Robo-Car Testing

In the four months since an Uber self-driving car struck and killed a woman in Arizona, the ride-hail company’s autonomous vehicle tech has stayed off public roads. The governor of that state banned Uber from testing there; the company let its autonomous vehicle testing permit lapse in California; it pulled its vehicles off the streets of Pittsburgh, home to its self-driving R&D center.

Until today, when self-driving chief Eric Meyhofer announced in a blog post that Uber would return its self-driving cars to the roads in Pittsburgh. With a catch. For now, the vehicles will stay in manual (human-driven) mode, simply collecting data for training and mapping purposes. To prep for the tech’s return to the public space, Uber has undertaken a wholesale “safety review,” with the help of former National Transportation Safety Board chair and aviation expert Christopher Hart.

The broader impact of that review—whether it can put this tech back on the road while preventing the sort of crash that killed Elaine Herzberg—remains to be seen. But already, Uber has addressed one key piece of its robotic technology: the humans who help it learn.

When the National Transportation Safety Board released its preliminary report on the Uber crash in May, it noted that the company’s self-driving software had not properly recognized Herzberg as she crossed the road. But it also noted that Uber’s system relied on a perfectly attentive operator to intervene if the system got something wrong. As far back as World War II, those who study human-machine interactions have said this kind of reliance is a mistake. People just can’t stay that alert for long periods of time.

This is a problem for self-driving car developers, who believe testing on public roads is the only way to expose their tech to all the strange, haphazard things that happen there. But testing imperfect robots among the living means relying on flesh-and-blood babysitters to take the wheel.


The changes Uber announced today, though still light on details, focus on that attention issue, and seem to bring the company up to speed with the standards of the industry, safety driver-wise. Two weeks ago, it laid off all its safety drivers—over 100 in San Francisco and Pittsburgh—and began hiring for a new “mission specialist” role.

Now, rather than depend on a single operator to monitor both the road and the AV technology, as it did in the months leading up to the crash, Uber will put two “mission specialists” in each car. (Some of these new roles were filled by old safety drivers, who were invited to re-apply for the positions.) One will sit behind the wheel and monitor the road, and one will sit in the passenger seat and make notes about the environment and the software’s operations. Other companies, like GM’s Cruise and Nutonomy, also test with two operators in each vehicle.

Uber has also added a driver monitoring system to its testing vehicles. No longer will the company rely on stern warnings and good faith to ensure that its operators are paying perfect attention to the road. Instead, Uber will use a driver-facing camera to monitor the head position of the operator behind the wheel.

Uber says the software-enabled camera should be able to tell whether the driver’s head is tipped down to look at something like, say, a phone, or turned to rubberneck. If the system detects that the driver has stopped looking at the road, it will emit a loud beep, a setup similar to the one used by General Motors’ semiautonomous Super Cruise feature. (Whether humans can properly snap to attention after that sort of warning and orient themselves enough to prevent a crash is still a matter of debate.) Meanwhile, a remote mission specialist will receive an alert that a driver isn’t being sufficiently attentive and can tune into a live feed of what’s happening inside the car. That specialist should be able to communicate via laptop with the specialist in the passenger seat if anything has gone especially awry.

Uber says it has also retooled the central console tablet inside its vehicles, the sort of screen that safe driving experts say can be dangerously distracting for those behind the wheel. Because the safety driver will no longer be charged with monitoring the self-driving technology, the interface will mostly just show the turn-by-turn navigation system. Uber declined to share specifics about changes to the tablet’s interface.

“This is a responsible, reasonable move to fall closer in line with others who are testing in the area,” says Bryan Reimer, who studies human-machine interaction at MIT. The head monitoring system is “a major step in the right direction,” he says, but he notes that Uber should also consider an eye-tracking setup. He and his colleagues have found that what a driver’s eyes are doing—staring at the horizon versus scanning the road—is the best predictor of whether they are actually paying attention.

It’s a weird irony: As they work to get rid of human drivers forever, autonomous vehicle developers first need a thorough understanding of how humans drive.
