Tesla’s 'Full Self-Driving' Can Only Travel 13 Miles Without A Driver Stepping In

For years, Tesla has proudly paraded its advanced driver assistance system, Full Self-Driving, as the real deal. The company claims the system can navigate traffic and manage highway driving, and it has repeatedly called it the future of driving, even as the number of crashes, collisions and even deaths linked to the system mounts. Now, a new study has looked into just how far the system can actually drive before needing assistance from a human, and the answer is: not very far.

Automotive research company AMCI Testing wanted to find out just where the limits of Full Self-Driving lay, so it set out to cover more than 1,000 miles on the streets of California, reports Ars Technica. Over the course of that driving, its researchers had to step in and take the wheel from the Tesla system more than 75 times.

Safety drivers riding in the FSD-equipped Teslas had to take control of the car roughly once every 13 miles, reports Ars Technica (more than 1,000 miles divided by more than 75 interventions works out to about 13 miles per takeover), due to run-ins with red lights and, in some instances, cars coming in the other direction. As the site reports:

The dangerous behavior encountered by AMCI included driving through a red light and crossing over into the oncoming lane on a curvy road while another car was headed toward the Tesla. Making matters worse, FSD’s behavior proved unpredictable—perhaps a consequence of Tesla’s reliance on the probabilistic black box that is machine learning?

“Whether it’s a lack of computing power, an issue with buffering as the car gets ‘behind’ on calculations, or some small detail of surrounding assessment, it’s impossible to know. These failures are the most insidious. But there are also continuous failures of simple programming inadequacy, such as only starting lane changes toward a freeway exit a scant tenth of a mile before the exit itself, that handicaps the system and casts doubt on the overall quality of its base programming,” Mangiamele said.

Those shortcomings with Autopilot and FSD have been well-documented, with owners reporting that their Teslas have failed to recognize everything from rail crossings to parked police cars. In some instances, FSD’s trouble recognizing obstacles and hazards in the road has led to crashes.

However, AMCI is keen to point out how far the system has come in recent years. The research firm said that anyone getting into an FSD-enabled Tesla for the first time is sure to be hit with a “sense of awe” at first, which could then lead to problems further down the road, as Electrek reports:

Guy Mangiamele, Director of AMCI Testing, explains: “It’s undeniable that FSD 12.5.1 is impressive, for the vast array of human-like responses it does achieve, especially for a camera-based system. But its seeming infallibility in anyone’s first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency.

“When drivers are operating with FSD engaged, driving with their hands in their laps or away from the steering wheel is incredibly dangerous. As you will see in the videos, the most critical moments of FSD miscalculation are split-second events that even professional drivers, operating with a test mindset, must focus on catching.”

Those miscalculations come for everyone, whether they’re fully trained test drivers or regular people just going about their daily business. And while AMCI was happy to share how many times it was forced to take the wheel, Tesla hasn’t been so forthcoming about how often actual owners have to step in and take control of their cars.