Autopilot technology drives Teslas but comes with warnings

In this Sept. 29, 2015 file photo, Elon Musk, CEO of Tesla Motors Inc., introduces the Model X car at the company's headquarters in Fremont, Calif. (AP Photo/Marcio Jose Sanchez, File)

A Tesla in Autopilot mode can drive itself but it's not a "self-driving" vehicle, at least as far as safety regulators are concerned.

So, instead of coming under heavy government scrutiny before being sold to the public, Tesla can mass-produce cars that automatically adjust speed with the flow of traffic, keep their lane and slam the brakes in an emergency.

Tesla tells its customers to stay alert while driving, only use the technology on divided highways, keep their hands on the wheel and be prepared to take over should the technology fail. Some clearly don't—online videos, including some with the "driver" in the back seat, show people taking the very risks Tesla warns against.

Still, the disclaimers—and a few regulatory wrinkles—are enough for the government.

Tesla made sure of that before going to market with the technology in October, approaching the Department of Motor Vehicles in its home state of California to check whether officials would throw up any roadblocks.

The department was neck-deep in writing rules for "autonomous vehicles" that one day will be able to drive themselves without human control. But California's DMV had no authority over Tesla's Autopilot, which did not qualify as "autonomous" because of its need for human backup.

If officials at the National Highway Traffic Safety Administration had any safety concerns about Autopilot, they too had no basis for tapping the brakes on the technology before its debut.

It's not a loophole—it's the way automobile regulation works in the United States. Automakers can add what they call an advanced driver-assistance system, such as Autopilot's lane-keeping, as long as the technology meets broad federal safety requirements. It's only if there are problems once the technology is on the road that regulators swoop in.

That is happening now, as NHTSA investigates whether a defect in Autopilot played a role in a fatal crash in Florida in May. The driver, an Autopilot enthusiast, was killed when his Model S failed to detect a truck that had turned left across oncoming traffic on a divided highway.

NHTSA's reactive approach is the opposite of how the Federal Aviation Administration treats the autopilot systems that commercial pilots rely on for most flights.

"We wouldn't dream of putting a new automated technology on a plane without testing it first to the FAA's satisfaction," said Missy Cummings, who as director of Duke University's Humans and Autonomy Lab has studied the limitations of machine-aided operation in planes and cars.

In a nod to those concerns, U.S. Transportation Secretary Anthony Foxx said Tuesday that government regulators and the auto industry need to engage in a more rigorous review of self-driving technology before it enters the marketplace to assure consumers it is "stress-tested."

After testing Autopilot for about a year, Tesla CEO Elon Musk unveiled the system last fall with characteristic flair. He played up the technology but also cautioned drivers "to keep their hands on the wheel just in case" because "the software is very new." Since the Florida crash and subsequent federal investigation, Musk has said Tesla is working to improve Autopilot.

Tesla is not the only high-end automaker that has introduced advanced driver-assistance systems. Mercedes sedans, for example, can keep their lanes. Audi plans to introduce what it says will be the most advanced self-driving system on the market in its A7 sedan in 2018, though the technology will be limited to low-speed, commuter-style traffic at first.

Tesla is, in the parlance of Silicon Valley, a disruptor of established automakers. Traditionally, a company outfits test cars with a new technology and recruits hundreds of research assistants and others familiar with the system's limitations to test drive them, learning the unanticipated ways the cars can crash, said Jim Sayer, a research scientist at the University of Michigan's Transportation Research Institute.

A truck crossing the car's path, the deadly circumstance in Florida, is a common crash scenario that "should have been thoroughly tested," he said.

In a blog post Wednesday, Musk defended the company's decision to market partial self-driving technology now rather than waiting until Autopilot has greater road test experience that might reveal other problems not anticipated by engineers.

"When used correctly," he wrote, "it is already significantly safer than a person driving by themselves and it would therefore be morally reprehensible to delay release simply for fear of bad press or some mercantile calculation of legal liability."

By enlisting its fiercely loyal owners as test drivers, Tesla says it has almost 90,000 cars and SUVs gathering Autopilot data. That data gives the company a tremendous competitive advantage, because it lets Tesla improve the technology with software updates beamed directly to cars.

Tesla's fleet has driven a combined 140 million miles on Autopilot since October. By contrast, the company that has done by far the most extensive testing of truly self-driving car prototypes—Alphabet's Google—reports driving 1.7 million autonomous miles in several dozen prototypes since 2009. Trained safety drivers are behind the wheel, paid to stay alert.

The Google car is trying to solve a much harder problem: how a car can drive itself without even a steering wheel for a person to grab. The mileage gap illustrates how fast out of the gate Tesla has been.

The owner-as-test-driver model is unprecedented in the auto industry, according to Daniel V. McGehee, an expert in auto crashes and advanced safety technology at the University of Iowa.

"It's an interesting psychology," McGehee said, "and it's been useful to them."
