Tesla’s Autopilot Vexes Some Drivers, Even Its Fans

After his Tesla Model S had driven itself for 17 straight minutes on Interstate 66 in Virginia last November, Carl Bennett, sitting in the driver’s seat, began looking at a printed document.

Seconds later, he glanced up and saw a truck parked in the road ahead. His car’s Autopilot technology didn’t react the way he expected, Mr. Bennett said. He slammed on the brakes, swerved and hit the truck. He wasn’t hurt, but the $106,000 electric car was totaled.

Mr. Bennett, a consultant who lives in Warrenton, Va., complained to Tesla Motors Inc. The auto maker replied that the crash was his fault, according to a letter from Tesla reviewed by The Wall Street Journal.

Tesla said the self-driving system worked exactly as it should have, citing data from the car’s “historical log file,” a document signed by Mr. Bennett, and an owner’s manual declaring the technology “cannot detect all objects and may not brake/decelerate for stationary vehicles.”

Interviews with drivers and engineers suggest that enthusiasm for autonomous driving has raced ahead of the technology’s capabilities, deepening concerns about road safety.

Tesla Motors Chief Executive Elon Musk speaking at the auto maker’s headquarters last October. Tesla’s self-driving system made its debut that month and is now on about 70,000 vehicles world-wide. PHOTO: DAVID PAUL MORRIS/BLOOMBERG NEWS

Since the Autopilot software made its debut in October, it has been linked to a handful of crashes, in addition to a fatal one in Florida in May that is now under investigation by U.S. auto-safety regulators. That death was the first in a Tesla car using Autopilot, the company said, spurring a broader reckoning about self-driving technology.

Regulators said Wednesday they are “collecting information” about a July 1 Tesla Model X crash in Pennsylvania. Tesla said it has “no reason to believe Autopilot has anything to do with this accident” but has been unable to access certain data from the vehicle.

Two drivers told the Journal they believe Tesla’s technology failed to recognize a stopped vehicle. As a result, the drivers said, their vehicles kept going until it was too late.

Other users of Autopilot said it works so well that it has lulled them into potentially dangerous situations, such as falling asleep at the wheel and allowing the car to navigate construction zones.

“I look down at my phone a little more than I used to,” said Jason Hughes, a Hickory, N.C., computer programmer who uses the Autopilot system on his Tesla about 90% of the time. “People are overly confident in it, in my opinion. They think it can do magical things, but it can’t go beyond what its sensors tell it.”

In response to questions from the Journal, a Tesla spokesman said there have been several nonlethal accidents since the inception of Autopilot, now on about 70,000 vehicles world-wide. He said the accident rate among Tesla drivers who use the technology is far lower than among those who don’t.

Still, the Palo Alto, Calif., auto maker has begun looking for ways to intervene when drivers misuse its automated-driving system.

Tesla might decide to temporarily disable Autopilot if a driver consistently doesn’t respond to a warning light or bell, and then fails to put his hands back on the wheel, a person familiar with the matter said.

This Tesla Model S crashed into an 18-wheel truck in Levy County, Fla., on May 7 while using the car’s self-driving system. The crash killed Tesla driver Joshua Brown of Canton, Ohio. PHOTO: ROBERT VANKAVELAAR/REUTERS

Tesla also is planning to update the Autopilot software in coming months but won’t make specific changes as a result of the fatal crash, the Tesla spokesman said.

Joshua Brown of Canton, Ohio, crashed into an 18-wheel truck on May 7. Florida Highway Patrol Sgt. Kim Montes said a portable DVD player was found in Mr. Brown’s Model S. Law-enforcement officials don’t know if the device was on at the time of the collision. The truck driver said in an interview that Mr. Brown was watching a movie.

The auto maker and its chief executive, Elon Musk, have urged caution about using Autopilot while aggressively promoting its capabilities. Owner’s manuals state that the technology “is designed for your driving comfort and convenience and is not a collision warning or avoidance system.”

Tesla said Autopilot “is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”

In March 2015, Mr. Musk told reporters the system could make it possible in the future for Tesla vehicles to drive themselves “from parking lot to parking lot.”

Tesla was testing Autopilot on a route between Seattle and San Francisco, Mr. Musk said, and the technology was almost capable of handling the entire trip without the driver touching any controls.

In April of this year, Mr. Musk retweeted a video by Mr. Brown that shows the Autopilot of his Model S preventing a crash.

The Tesla spokesman said Mr. Musk’s comments about the test vehicle’s capabilities conveyed his vision for the system, adding that customers are told explicitly what Autopilot is and isn’t capable of doing.

Tesla has said its technology is the most advanced system on the market, partly because drivers can download improvements directly to their vehicles.

Offerings from other auto makers include adaptive cruise control, automatic brakes and other features that perform some driving functions when activated. Google parent Alphabet Inc. is developing and testing technology aimed at driving cars without any human interaction.

Some auto makers have moved more slowly than Tesla. General Motors Co. in 2012 said it was testing a technology called “Super Cruise,” showing how a self-piloted Cadillac could allow drivers to sit back and relax. The driverless system was supposed to debut this year but has been delayed until 2017.

Volvo Car Corp. offers a hands-free option on its XC90 sport-utility vehicle but limits use to speeds of 30 mph or less. The cap will rise to 80 mph later this year in a new sedan called the S90.

Mobileye NV, a supplier of driverless equipment to Tesla and other auto makers, relies on a camera to detect cars, lane lines and other obstacles. Mobileye said its system isn’t designed to always detect vehicles cutting in front with a left turn, which occurred in the fatal Florida crash.

The company said its “lateral turn across path” function will be available in 2018.

Autopilot will usually detect such crossings, but in the crash that killed Mr. Brown, the system failed to distinguish the 18-wheeler’s white trailer from the bright sky, Tesla said last week.

If the system had detected the truck’s front or rear, it likely would have sounded alarms and slowed the car, according to Tesla.

Tesla said last week that Mr. Brown’s death was the first fatality in more than 130 million miles driven with Autopilot. In the U.S. overall, there was one death for every 89.3 million miles driven last year, according to a National Highway Traffic Safety Administration estimate.
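Expressed as rates, the back-of-the-envelope comparison implied by those figures (which isn’t a controlled safety measure, since the two pools of miles aren’t driven under identical conditions) works out to:

\[
\text{Autopilot: } \frac{1 \text{ death}}{130 \text{ million mi}} \approx 0.77, \qquad \text{U.S. overall: } \frac{1 \text{ death}}{89.3 \text{ million mi}} \approx 1.12
\]

deaths per 100 million miles driven.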

Proponents of driverless cars say the technology has the potential to reduce traffic fatalities, which topped an estimated 35,000 in the U.S. last year. More than 90% of fatal crashes stem from human error, including drunken driving. “Driverless cars don’t drink,” said GM President Dan Ammann in an interview in May.

Chris Allessi II, a former car salesman from Greendale, Wis., said the technology saved him from hitting a texting bicyclist who pulled in front of him in May.

“I guarantee he wouldn’t have survived,” Mr. Allessi said.

Austin Meyer, a software entrepreneur in Columbia, S.C., said he typically uses Autopilot only when he wants to do something else while driving, such as search for a new radio station, look up directions or eat lunch.

“At that point, you’re probably safer with Autopilot because you’re going to be distracted anyway,” Mr. Meyer said.

The first time he let the car drive itself, Mr. Meyer said, he had to quickly grab the wheel to stop his Tesla from running over a line of orange traffic cones. He later recorded a video of himself reading a newspaper while the car traveled down a private road.

Switching from passive to active driving might be too complicated for some drivers, especially after a long period of inactivity, according to some driverless-car advocates.

“Human nature is such that if they’re not doing anything, they’re going to get distracted or drowsy and incapable of taking over,” said Dean Pomerleau, an adjunct professor at Carnegie Mellon University in Pittsburgh who has worked on self-driving technology for 25 years.

Mr. Bennett, whose Tesla hit a truck on I-66 in November, said a warning bell sounded too late for him to avoid the collision. He urged Tesla to highlight more forcefully that the technology might not detect some stationary vehicles and other objects, including motorcycles, cyclists and small animals.

Tesla responded that it was “sorry to hear of your accident and pleased that you are safe,” according to a December letter to Mr. Bennett. But the crash “was the result of driver error,” the company concluded after its review. “To an attentive driver, it would have been clear that the driver should immediately slow the vehicle to avoid the accident.”

According to Tesla, Mr. Bennett pressed the brake pedal less than a second before hitting the truck and turned the steering wheel clockwise. Tesla said those moves “canceled the activation” of the automatic emergency braking system.

The letter recommended that Mr. Bennett, 45 years old, review his owner’s manual “to familiarize yourself with the specific conditions for the proper usage” of Autopilot.

Mr. Bennett said he accepts responsibility for the crash but remains dissatisfied with Autopilot’s performance and how Tesla informed drivers about its limitations. “Just make this more obvious to owners,” he said.

Arianna Simpson, a venture capitalist in San Francisco, said the Autopilot in her Model S “did absolutely nothing” when the car she was following on Interstate 5 near Los Angeles changed lanes, revealing another car parked on the highway.

Her Model S rammed into that car, she said, damaging both vehicles but causing no major injuries.

Tesla responded that the April crash was her fault because she hit the brakes right before the collision, disengaging Autopilot. Before that, the car sounded a collision warning as it should have, according to vehicle data.

“So if you don’t brake, it’s your fault because you weren’t paying attention,” said Ms. Simpson, 25. “And if you do brake, it’s your fault because you were driving.”

She doesn’t expect to use Autopilot much once her Model S is repaired, partly because she thinks she would constantly second-guess the automated-driving system.

“When I have a bug on my app, it crashes. When I have a bug on my car, people die. There is a slightly different burden on the company,” Ms. Simpson said. “I don’t want to die, and I don’t want others to die.”

In response, Tesla said it has “continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”