Tesla’s FSD: A Far Cry from Full Self-Driving

Elon Musk has long promised that Tesla’s Full Self-Driving (FSD) software would revolutionize driving safety. However, as Rolling Stone’s Miles Klee discovered, the reality is far from this ambitious vision.

Klee’s test drive in a Tesla Model 3 equipped with FSD revealed significant flaws in the system’s capabilities. The software often struggled to accurately perceive the world around it, misidentifying objects and even failing to detect pedestrians. This is partly due to Tesla’s reliance on cameras instead of LiDAR, a technology used by many competitors.

The system’s limitations are particularly evident in challenging conditions like poor weather or direct sunlight. In such situations, the software may issue warnings, urging drivers to take over.

Regulatory scrutiny of FSD has also intensified. The National Highway Traffic Safety Administration (NHTSA) is investigating numerous accidents involving Tesla vehicles using the software. The regulator has concluded that drivers using FSD may develop a false sense of security and become less engaged in the driving task.

Klee’s test drive highlighted the dangers of relying on the system. The vehicle nearly collided with several objects and even ran a stop sign. In a previous staged test conducted by the Dawn Project, the FSD system failed to recognize a school bus and collided with a child mannequin.

These incidents raise serious questions about the safety of FSD and the reliability of Tesla’s claims. While the company continues to push for autonomous driving, it is clear that there is still a long way to go before FSD can truly be considered a reliable and safe technology.

References: The Dawn Project; Miles Klee’s self-driving account in Rolling Stone; Futurism; The Verge; Tesla website