As with other crashes involving Teslas, a fatal accident in Houston in April prompted questions from safety officials about whether the car’s semi-autonomous Autopilot feature was involved.

This time the concerns were especially pointed. That’s because when emergency personnel examined the charred wreck of the 2019 Model S, they didn’t find anyone in the driver’s seat.

Reports of the crash of an apparently driverless Tesla sparked a sell-off of the automaker’s shares, causing CEO Elon Musk’s net worth to plunge by $5.6 billion. The accident, which killed the car’s 59-year-old owner and his 69-year-old friend, was not something the company could shrug off.  

Now Autopilot issues have surfaced for Tesla again in a big way. On Saturday, the Wall Street Journal reported that the company was recalling 285,000 vehicles in China to deal with safety concerns involving the system. 

Although it may be paving the way for the future of driving, Autopilot has raised alarm in the U.S. over whether it is being used safely. As more details about the Houston incident surfaced last month, the mystery only deepened. Musk tweeted that the car’s “data logs” showed Autopilot wasn’t being used at the time of the crash, and that the owner hadn’t purchased the “full self-driving” feature.

A preliminary investigation by the National Transportation Safety Board appeared to support Musk’s assertion, though it was inconclusive. Switching on Autopilot requires both a vehicle’s Traffic Aware Cruise Control and Autosteer features to be engaged, and the latter wasn’t available on the residential street where the accident occurred because it lacked lane markings.

Yet the circumstances still looked suspicious. The two men were overheard by their wives talking about using the vehicle’s Autopilot system just minutes before they got into the car on the evening of April 17, according to police. A home security camera showed the owner getting into the driver’s seat. Then the Model S sedan went just 550 feet before, at a speed well over the 30 mph limit, it drove over a curb, smashed into a drainage culvert, a manhole and a tree, and burst into flames.

After the fire was put out, all that was left of the car was a pile of charred metal. The remains of one man were found in the front passenger seat; the other’s were in the back seat. Investigators are still looking into whether seatbelts were used, or whether the car’s owner could have been thrown from the driver’s seat into the passenger seat.

An open-and-shut case it was not, especially in light of a growing number of stories about drivers misusing Autopilot: taking a snooze or playing games while behind the wheel, or even climbing into the back seat while the vehicle is on the road. One TikTok influencer, for instance, recently posted a video of himself wrapped up in blankets with his eyes closed while his Tesla cruised down the highway with no human in control.

And just this week, a 25-year-old man was charged with reckless driving after he was caught riding in the back seat of his Model 3 as it traveled down Interstate 80 in the San Francisco Bay Area. No one was at the wheel.

Is Autopilot a misnomer?

Tesla has taken heat in the past over the Autopilot name, which seems to suggest that drivers don’t have to remain vigilant behind the wheel and can let the vehicle’s computers take over. But Autopilot is not a fully autonomous system. The company describes it as an “advanced driver assistance system” that includes cameras, radar, sensors and a powerful computer.

Tesla also sells a package of additional features called Full Self-Driving Capability, which can navigate and steer around obstacles. But even with those enhancements, the company says the system is “intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”

Musk himself, though, has offered mixed messages on the subject. “I think Autopilot is getting good enough that you won’t need to drive most of the time unless you really want to,” the billionaire and recent Saturday Night Live host once told comedian Joe Rogan during a podcast. Unsurprisingly, research has shown that drivers tend to be more distracted when using Autopilot than when driving without it, even if it’s only the semi-autonomous version.

Tricking the car into thinking you’re driving

Sensors are supposed to ensure that Teslas have people behind the wheel, even when Autopilot is engaged. But Consumer Reports found in April, just a few days after the deadly Texas crash, that the system is easy to “trick.” Using a Tesla Model Y, a test driver working for the magazine was able to stop the car, move over to the passenger seat, and get the car to drive again solely via Autopilot.

The tester “fooled” the sensors by attaching a weighted chain to the steering wheel, mimicking the feel of a person’s hand. The car emitted no warning signals, and Autopilot remained engaged.

“It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient,” the test driver told Consumer Reports.

Other Autopilot-related crashes

The Houston crash was hardly the first fatal Tesla wreck potentially linked to a driver relying too heavily on Autopilot. In 2017, a Model X in Mountain View, California, plowed directly into a concrete barrier at 70 mph, killing the driver. Safety officials discovered he was playing a game on his iPhone at the time. A similar incident happened in 2018, also in California, while other deadly crashes blamed on Autopilot have happened in Florida and Arizona.

For years, the NTSB has been calling for Tesla to include more tools to monitor drivers who use Autopilot, to keep them alert and their eyes on the road. So far that doesn’t seem to have happened.
