Statistically speaking, you’re far less likely to die in a Tesla than in just about any other vehicle on the road. Even without any safety features engaged, a Tesla’s likelihood of getting into an accident is about half the national average. With safety features including Autopilot switched on, the chance drops to roughly one-tenth of the average rate.

But Tesla crashes attract outsize attention, especially when Autopilot is involved. Case in point: Yesterday, the National Highway Traffic Safety Administration revealed that it had launched a formal investigation into Autopilot systems in the Model Y, Model X, Model S and Model 3. The probe was prompted by 11 accidents, from 2018 through 2021, in which Teslas “encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” At least 17 people were injured in the accidents, and one person died.

Any new technology will, of course, have bugs. Identifying and working through them is part of the process of innovation. But when the tech in question carries people from place to place, often at high speed on crowded roadways, it probably makes sense for government officials to take a “better safe than sorry” approach. Here are some of the reasons why Tesla’s Autopilot may merit extra scrutiny. (The probe could lead to a recall of 765,000 vehicles.)

1. Drivers use the feature irresponsibly.
Legally speaking, drivers in all 50 states are required to be in control of their vehicles at all times, even if the car is equipped with a highly advanced “assistance” system like Autopilot. Despite this, there have been multiple instances of Tesla drivers being caught lounging or napping in the back of their cars while speeding down the highway. Research has also shown that drivers, even while still at the wheel, tend to be more distracted when using Autopilot than without it.

Tesla CEO Elon Musk, a powerful media presence with 55 million Twitter followers, might deserve some of the blame for reckless use of the assistance feature. “I think Autopilot is getting good enough that you won’t need to drive most of the time unless you really want to,” the billionaire once told comedian Joe Rogan during a podcast.

2. Tesla may not be doing enough to prevent this problem.
Another gripe of safety experts is that, for all the advanced semi-autonomous driving components, Tesla’s computers and sensors seem woefully inadequate at making sure operators’ rear ends stay in drivers’ seats and their eyes stay on the road. The current system detects pressure on the steering wheel to confirm that drivers are holding it and are ready to respond when necessary. But the mechanism is easy to “trick.”

“It’s very easy to bypass the steering pressure thing,” an electrical and computer engineering professor at Carnegie Mellon University told NPR. “It’s been going on since 2014. We’ve been discussing this for a long time now.”
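To make the weakness concrete, here is a minimal sketch, in Python, of what a purely torque-based hands-on-wheel check looks like. It is hypothetical, not Tesla’s actual software; the threshold value and function name are invented for illustration. The point is that a sensor that only measures force on the wheel cannot tell a human hand from a weight wedged into the rim.

```python
# Hypothetical illustration only -- not Tesla code. A hands-on-wheel check
# that relies solely on steering-wheel torque ("pressure") can't distinguish
# a human hand from an object hung on the wheel.

from typing import Sequence

TORQUE_THRESHOLD_NM = 0.3  # assumed minimum torque that counts as "hands on"

def driver_seems_attentive(torque_samples_nm: Sequence[float]) -> bool:
    """Return True if any sample in the check window exceeds the threshold.

    This is the weakness: the check only proves that *something* is tugging
    on the wheel, not that a person is watching the road.
    """
    return any(abs(t) >= TORQUE_THRESHOLD_NM for t in torque_samples_nm)

# A real hand produces intermittent torque...
print(driver_seems_attentive([0.0, 0.5, 0.1, 0.4]))  # True

# ...but so does a cheap wheel weight applying constant torque.
print(driver_seems_attentive([0.35] * 4))            # True -- spoofed
```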

3. Technology is great, but a long way from perfect.
As Tesla describes on its website, the Autopilot system is equipped with 8 external cameras, 12 ultrasonic sensors, and a “powerful onboard computer” to help the vehicle maneuver around obstacles and stay in its lane. A “Full Self-Driving” feature is available for some models, and promises the ability to “get you to your destination more efficiently by actively guiding your car from on-ramp to off-ramp.”

Even in “Full Self-Driving” mode, however, drivers must still be “fully attentive,” Tesla cautions. The company also warns of many common factors that can limit the capabilities of the cameras and sensors, including rain, snow, fog, bright oncoming headlights or bright sunlight, interference from certain types of electronic equipment, obstructions and debris, especially narrow or winding roads, and extreme temperatures.
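As a rough illustration of what “limited capabilities” could mean in software terms, the hypothetical sketch below maps the kinds of conditions Tesla lists to a conservative availability level. The condition names, tiers, and logic are assumptions for illustration, not Tesla’s actual behavior.

```python
# Hypothetical sketch, not Tesla's implementation: one way an assistance
# system might downgrade itself when the conditions the manual warns about
# (rain, snow, fog, glare, debris, extreme temperatures) are detected.

from dataclasses import dataclass

@dataclass
class SensorConditions:
    heavy_precipitation: bool = False  # rain or snow
    fog: bool = False
    glare: bool = False                # oncoming headlights / low sun
    camera_obstructed: bool = False    # mud, ice, debris on a lens
    extreme_temperature: bool = False

def autopilot_availability(c: SensorConditions) -> str:
    """Map degraded-sensing conditions to a conservative capability level."""
    if c.camera_obstructed or c.fog:
        return "unavailable"           # vision too unreliable to assist
    if c.heavy_precipitation or c.glare or c.extreme_temperature:
        return "degraded"              # e.g. lower speed cap, more driver nags
    return "full"

print(autopilot_availability(SensorConditions(glare=True)))  # "degraded"
```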

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said on Monday.

4. It should really be better at avoiding emergency scenes, at the very least.
Granted, the range of hazards and obstacles on roadways is virtually limitless, and it will take time before self-driving systems can adapt to all of them. But one would think that flashing emergency lights and signals are exactly the sort of thing such systems would be designed to pick up on.
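Flashing beacons do have a distinctive signature: they blink at a fairly regular rate, which shows up as periodic brightness in camera frames. The toy Python function below, with invented names and thresholds, illustrates that general idea only; it is not how any production perception system actually works.

```python
# Toy illustration, not a production perception stack: emergency beacons
# flash at roughly 1-4 Hz, which appears as a periodic brightness swing
# in a tracked image region.

def looks_like_flashing_beacon(brightness_per_frame, fps=30,
                               min_hz=1.0, max_hz=4.0, min_swing=0.3):
    """Crudely detect a 1-4 Hz on/off pattern in a region's brightness (0-1)."""
    if max(brightness_per_frame) - min(brightness_per_frame) < min_swing:
        return False  # no large brightness swing -> probably not a strobe
    # Count on->off and off->on transitions around the midpoint brightness.
    mid = (max(brightness_per_frame) + min(brightness_per_frame)) / 2
    states = [b > mid for b in brightness_per_frame]
    flips = sum(1 for a, b in zip(states, states[1:]) if a != b)
    duration_s = len(brightness_per_frame) / fps
    flash_hz = flips / (2 * duration_s)  # two flips per full flash cycle
    return min_hz <= flash_hz <= max_hz

# One second of frames from a region pulsing about twice per second:
frames = ([1.0] * 8 + [0.1] * 7) * 2
print(looks_like_flashing_beacon(frames))  # True
```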

It seemed to be a point of special concern for the NHTSA that the 11 crashes it is examining all involved scenes with emergency vehicles and signs present. Accidents at issue occurred in San Diego; Miami; Lansing, Michigan; Montgomery County, Texas; Charlotte, North Carolina; Cochise County, Arizona; West Bridgewater, Massachusetts; Cloverdale, Indiana; Norwalk, Connecticut; Laguna Beach, California; and Culver City, California.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the agency said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

5. When Teslas DO crash, it’s bad. 
As electric vehicles, Teslas rely on substantial lithium-ion batteries that can be extremely dangerous when damaged. They can ignite blazing-hot infernos that are much more difficult for emergency workers to put out than fires in gasoline-powered cars. The energy stored in the batteries can also cause wrecks to reignite multiple times, hours or days after the fire appears to be extinguished.

One burning Tesla in April required about 28,000 gallons of water to put out, roughly the amount a suburban Houston fire department typically uses in a month. Last year, the National Transportation Safety Board published a report calling out Tesla and other EV manufacturers for not providing emergency personnel with enough information to keep them safe when handling such wrecks.