Tesla's Full Self-Driving was tested by third-party firm AMCI Testing, and the result was a dismal average of just 13 miles between interventions.
One of the biggest concerns about Tesla's Full Self-Driving (Supervised) program is that, three years in, the automaker still refuses to share intervention data or any other data about the program.
Instead, we have to rely on anecdotal evidence and crowdsourced data, which Tesla CEO Elon Musk has lent some credibility to by commenting positively on it, even though the numbers it shows are fairly poor.
For example, more than 14,000 miles of crowdsourced data for FSD v12.5.1 show an average of 31 miles between disengagements and 174 miles between critical disengagements:
Now, we get more data from AMCI Testing, an independent automotive testing firm.
AMCI drove more than 1,000 miles on FSD 12.5.1 and 12.5.3 in a 2024 Model 3 Performance with Hardware 4, across a range of driving environments.
The results are even worse than most FSD critics expected. AMCI is reporting more than 75 interventions, or roughly one every 13 miles on average:
While impressive for a uniquely camera-based system, AMCI testing’s evaluation of Tesla FSD exposed how often human intervention was required for safe operation. In fact, our drivers had to intervene over 75 times during the evaluation; an average of once every 13 miles.
The company shared a video series showing some of the most impressive maneuvers by Tesla's system, as well as some of its failures:
After its testing, AMCI acknowledges that Tesla's FSD system is impressive, but it warns that the sense of awe you get when first trying it can breed complacency, which can be dangerous.
Guy Mangiamele, Director of AMCI Testing, explains:
“It’s undeniable that FSD 12.5.1 is impressive, for the vast array of human-like responses it does achieve, especially for a camera-based system. But its seeming infallibility in anyone’s first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency. When drivers are operating with FSD engaged, driving with their hands in their laps or away from the steering wheel is incredibly dangerous. As you will see in the videos, the most critical moments of FSD miscalculation are split-second events that even professional drivers, operating with a test mindset, must focus on catching.”
Mangiamele adds:
“What’s most disconcerting and unpredictable is that you may watch FSD successfully negotiate a specific scenario many times – often on the same stretch of road or intersection – only to have it inexplicably fail the next time. Whether it’s a lack of computing power, an issue with buffering as the car gets “behind” on calculations, or some small detail of surrounding assessment, it’s impossible to know. These failures are the most insidious. But there are also continuous failures of simple programming inadequacy, such as only starting lane changes toward a freeway exit a scant tenth of a mile before the exit itself, that handicaps the system, and casts doubt on the overall quality of its base programming.”
AMCI says that another series of videos from its first 1,000 miles on FSD will be released next week, and it plans to test future updates.
Earlier this week, I took my first drive on Tesla FSD v12.5.2 and saw results similar to those AMCI describes here.