A recent study conducted by AMCI Testing has raised serious concerns about the safety and reliability of Tesla’s Full Self-Driving (FSD) mode. After driving a Tesla equipped with FSD for over 1,000 miles, researchers concluded that the system’s capabilities were far from perfect.
The testers encountered numerous instances where the Full Self-Driving mode exhibited dangerous and unpredictable behavior. For example, the system ran a red light during a night drive in the city, seemingly following the lead of other vehicles that had also disregarded the traffic signal. Moreover, FSD failed to recognize or follow a double yellow line around a curve, veering into oncoming traffic.
The study revealed that human intervention was required more than 75 times over the 1,000-mile test, an average of roughly once every 13 miles. This high frequency of intervention suggests that FSD is not yet ready for fully autonomous operation.
While FSD demonstrated impressive capabilities in certain situations, such as navigating tight spaces, its shortcomings in critical safety areas are a major cause for concern. The study’s findings cast doubt on Tesla CEO Elon Musk’s ambitious plans to launch a driverless robotaxi service.
It is important to note that this study focused solely on Tesla’s FSD and did not compare its performance to competing autonomous driving systems like Waymo. However, even with a human driver present, FSD poses significant risks due to its tendency to instill a false sense of security and complacency.
References: Interesting Engineering, Tesla website, AMCI Testing report, Futurism