The family of Genesis Giovanni Mendoza-Martinez, who died in February 2023 when his Tesla Model S collided with a firetruck in San Francisco, is suing Tesla and its CEO, Elon Musk. The lawsuit accuses Musk of making deceptive statements about the capabilities of Tesla’s Autopilot and Full Self-Driving (FSD) software, creating a false impression of the vehicles’ self-driving abilities.
Mendoza-Martinez, who had relied solely on Autopilot for the 12 minutes preceding the crash, bought the car believing it could drive safely without human input — a perception his family claims Tesla and Musk intentionally cultivated.
The lawsuit points to Musk’s online statements, alleging that they exaggerated the capabilities of Tesla’s driver-assistance software despite its known limitations.
Tesla, in response, has placed the blame on Mendoza-Martinez, citing his negligence as the primary cause of the crash. Regardless, the incident has added to mounting scrutiny of Tesla’s Autopilot and FSD software.
The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla’s Autopilot system since 2021, and Mendoza-Martinez’s crash is among several cases under review. The agency has found that the system can foster over-reliance, leading drivers to disengage from the driving task.
At least 15 other lawsuits involving Tesla’s Autopilot or FSD are ongoing, many stemming from crashes that caused severe injuries or fatalities. Critics argue that Tesla’s naming conventions, such as “Full Self-Driving,” encourage a dangerous misunderstanding of the software’s limitations.
References: NBC News; California Department of Motor Vehicles; National Highway Traffic Safety Administration; The Independent.