
People Have Got to Stop Confusing Their Teslas for Self-Driving Cars

We are entering a dangerous period in the development of self-driving cars. Today, you can buy a Cadillac, Volvo, Tesla, Audi, and even a Nissan that will do some of your driving for you, as long as you stay on top of things. It's all part of the steady trek toward the fully autonomous vehicles that will let you fully check out, and catch an in-car movie or two on your way to wherever you're going.

But we’re not there yet, and a growing body of evidence shows that these partially autonomous systems are lulling drivers into a false sense of security. In one example after another, it's clear that too many people don't understand, or simply ignore, the limitations of these robocars-in-training. They zone out, look away, even fall asleep. And they cause crashes.

Automakers anticipated such problems, and have tried to respond with systems that keep drivers focused and aware of their responsibilities, even when their hands are off the wheel and feet are nowhere near the pedals. But two recent incidents, each involving a Tesla, have thrown the problem back into the headlines, and those solutions into doubt.

On Monday, a Tesla Model S smashed into a stopped firetruck that had responded to an accident on the freeway in Culver City, California. The car buried itself under the rear of the truck, crumpling the hood to less than a third of its original length and folding it over the windshield. Nobody was hurt. According to the fire department, the driver claimed that the car was “on autopilot.” The National Transportation Safety Board is now reportedly considering an investigation into the crash.

Over the weekend, a driver in another Tesla Model S sedan was arrested and charged with a DUI after he was found passed out behind the wheel on San Francisco's Bay Bridge. His blood alcohol content was twice the legal limit. He told California Highway Patrol officers it was OK: The car was on autopilot.

Tesla’s response to both hiccups is to reiterate what the car tells drivers before they can engage Autopilot for the first time: It is their job to stay fully attentive, and to keep their hands on the wheel, at all times. A spokesperson pointed out that the owner's manual reads, “Autosteer is not designed to, and will not, steer Model S around objects partially or completely in the driving lane,” which may explain why the car hit the firetruck. As for the drunk guy, if the car detects your hands aren't on the wheel, it will beep at you. If you still don't touch the wheel, the car will put on its hazards and slow to a stop, figuring that's better than rolling along with no human supervisor. (Most of these systems do the same thing.) Great if you have a heart attack. Not great if you're trying to get away with driving over the limit.

Both drivers seemed to expect more from their cars than they can actually deliver. And that's understandable, since they give the impression of being self-driving for mile after mile of uneventful highway driving. And the better they work, the more drivers will trust them, to the point where it's easy to forget that they're not infallible. That's a serious problem, because these controls are not meant to be relied on.

“I think there’s a broad misunderstanding about semi-autonomous versus autonomous,” says Aaron Ames, an engineering professor at Caltech’s Center for Autonomous Systems and Technologies. “This will only grow over time.”

That's because lots of automakers are boarding the semi-self-driving train, eager to offer customers the chance to pay extra for a swanky new feature. There's Tesla's Autopilot, Cadillac's Super Cruise, Audi’s Traffic Jam Pilot, Nissan’s ProPilot Assist. Mercedes and Infiniti have similar systems. They all work by combining existing safety features—mostly adaptive cruise control, lane keeping assist, and automatic emergency braking—using cameras and radar to keep the car in its lane and a safe distance from other vehicles.

You can see why manufacturers are tempted to slap simple, catchy names on these collections of kit: They're a lot easier to sell that way. And you can understand why drivers buying a car with a function called "pilot" think the robot can do all the work. But it can't. All these systems rely on the human paying attention, ready to jump in in case the car encounters something it can't handle on its own, like the sudden disappearance of lane lines.

They have different ways of checking for that attention. Tesla requires the driver to touch the wheel occasionally, to prove they’re there and awake. Cadillac allows drivers to go hands-free, but uses an infrared camera to make sure they keep their face pointed at the road. And they have different ways of getting the driver to refocus: mostly, they beep and flash warnings. Audi's system will tighten your seat belt and tap the brakes, just to get your attention.

All of this is not to say these systems don’t work. They do, and they significantly increase safety. The Volvo XC60 SUV took first place in the UK’s What Car? safety awards on Tuesday night. Not only did it ace the crash test, said judge Matthew Avery of Thatcham Research, "it is also bursting at the seams with safety technology to avoid the crash happening at all."

Thatcham says automatic emergency braking alone cuts rear-end crashes by 38 percent. The National Highway Traffic Safety Administration’s report into a fatal crash between a Tesla with Autopilot engaged and a truck, in Florida in 2016, concluded that crash rates dropped 40 percent after the Autosteer function was added.

Yet a National Transportation Safety Board report into the same crash was less forgiving, putting some of the blame on Tesla for selling a system that allowed the driver, Joshua Brown, to ignore warnings and misuse Autopilot, effectively letting the car drive itself.

The answer to this growing problem is twofold. First, people must learn to be better drivers, and not abuse these fledgling technologies. “Anytime people are given a system, they’ll try to exploit it. Then the car companies will say ‘That isn’t what it was designed for,’ and then we’re at an impasse,” says Ames.

And automakers need better ways to teach their customers that lesson. Better yet, they need to design a system that can't be so easily abused. Because, as Elon Musk once said, "Any product that needs a manual to work is broken."
