US ends probe of Tesla fatal crash without seeking recall

This photo, provided by the National Transportation Safety Board via the Florida Highway Patrol, shows the Tesla Model S driven by Joshua Brown, who was killed when the sedan crashed while in self-driving mode on May 7, 2016. A source tells The Associated Press that U.S. safety regulators are ending an investigation into a fatal crash involving electric car maker Tesla Motors' Autopilot system without a recall. The National Highway Traffic Safety Administration scheduled a call Thursday, Jan. 19, 2017, about the investigation. (NTSB via Florida Highway Patrol via AP, File)

U.S. safety regulators have closed an investigation into a fatal crash involving electric car maker Tesla Motors' Autopilot system without seeking a recall, but they criticized the way the company markets the semi-autonomous driving feature.

The National Highway Traffic Safety Administration found that the system had no safety defects at the time of the May 7 crash in Florida, and that it was primarily designed to prevent rear-end collisions, spokesman Bryan Thomas said Thursday. Investigators also reviewed a crash on the Pennsylvania Turnpike in which two people were injured, as well as dozens of other crashes involving Autopilot in which air bags were deployed, Thomas said.

Tesla won't be fined, but the agency criticized the company for calling the system Autopilot, Thomas said.

The probe began June 28, nearly two months after a driver using Autopilot in a Tesla Model S died when the system failed to spot a tractor-trailer crossing the car's path on a highway in Williston, Florida, near Gainesville.

Tesla's Autopilot system uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company said that before Autopilot can be used, drivers must acknowledge that it's an "assist feature" that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.

The lack of a recall is good news for Tesla: it means the agency is either blaming the crash on human error or sees no need for a recall because Tesla's software updates have already addressed the problem, said Karl Brauer, executive publisher of Kelley Blue Book.

"Either one reflects well on Tesla," he said.

But the agency's findings are likely to influence how automakers market semi-autonomous systems. Just about every auto company has a similar system or is developing one as the industry moves toward self-driving cars.

The May 7 crash killed former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla, which collects data from its cars via the internet, said at the time that the cameras on Brown's Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and that neither the car nor Brown applied the brakes.

Scenarios in which another vehicle crossed the sedan's path were beyond the capabilities of the system, Thomas said Thursday on a conference call.

The closure of the investigation without a recall "helps clarify that cars are still supposed to be driven by attentive people, and if people behind the wheel aren't attentive, it's not the technology's fault," Brauer said. That will help avoid the stigma that the technology causes accidents, he said.

Thomas highlighted two conclusions from the investigation. First, that advanced automated driving systems still require the "continual and full attention of a driver" who should be prepared to take action. And second, that manufacturers need to pay attention to how drivers actually use the technology, not just how they're supposed to use it, and to design their vehicles "with the inattentive driver in mind."

Tesla said in a statement that it appreciated NHTSA's thoroughness in reaching its conclusion.

In July, investigators asked Tesla for information on how Autopilot works at intersections with crossing traffic. They also asked Tesla to describe how the system detects "compromised or degraded" signals from cameras and other sensors and how such problems are communicated to drivers.

When Tesla released Autopilot in the fall of 2015, some safety advocates questioned whether the Palo Alto, California-based company and NHTSA should have allowed the public to use the system before testing was finished. The company acknowledged "beta testing" the system on cars driving on public roads.

Consumer Reports magazine called on Tesla to drop the "Autopilot" name because it can give drivers too much trust in their car's ability to drive itself. The influential magazine urged Tesla to disconnect the automatic steering system until it's updated to make sure a driver's hands stay on the wheel at all times.

In September, Tesla updated the Autopilot software to rely more on radar sensors and less on cameras. The update also disables the automatic steering if drivers don't keep both hands on the wheel.
