2024-06-29
Tesla's Full Self-Driving Technology Fails to Detect Moving Train, Raising Safety Concerns
In an incident that highlights the potential dangers of emerging autonomous driving technologies, a Tesla vehicle operating in Full Self-Driving (FSD) mode failed to detect a moving train, nearly resulting in a collision. The incident, which occurred in Ohio under foggy conditions, has sparked renewed debate about the safety and reliability of Tesla's advanced driver assistance systems.
Craig Doty II, a certified general appraiser from Ohio and the owner of the Tesla, narrowly avoided a catastrophic accident when he was forced to take manual control of his vehicle at the last second. Doty, who was driving at approximately 60 miles per hour in a 55 mph zone, reported that the car's FSD system failed to slow down or stop as it approached a train crossing with visible flashing red lights and moving boxcars.
"I was like there's no way it doesn't see the train," Doty recounted. "There's no way it doesn't see the flashing lights. Yes, it was foggy, but you can still see the lights."
The incident raises serious questions about the capabilities and limitations of Tesla's Full Self-Driving technology, which the company sells as a premium driver assistance option for $8,000 upfront or $99 per month. Although parts of the technology remain in "beta," Tesla has marketed FSD as a crucial part of the company's future.
Tesla's website states that vehicles equipped with FSD will be able to drive themselves "almost anywhere with minimal driver intervention" and will "continuously improve." However, the company also emphasizes that the currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.
Doty, who had owned his Tesla for about a year and had driven approximately 20,000 miles with FSD activated, admitted to becoming complacent with the system's abilities. "You do get complacent that it knows what it's doing," he said. "And usually it's more cautious than I would be as a driver."
The National Highway Traffic Safety Administration (NHTSA) has stated that it is aware of the incident and is gathering more information from Tesla. This event adds to the growing list of concerns surrounding Tesla's autonomous driving technologies, including previous accidents and fatalities associated with the Autopilot system.
In the aftermath of the incident, Doty received a citation for "failure to control" the vehicle, which carries a $175 fine. During his court hearing, he pleaded no contest and requested leniency, citing the car's FSD mode as a mitigating factor. The judge agreed to strike the citation if Doty can prove by July that the damage to the rail crossing has been repaired and paid for, either by himself or by his insurance.
This near-miss serves as a stark reminder of the current limitations of autonomous driving technologies and the importance of driver vigilance, even when using advanced driver assistance systems. It also underscores the need for clearer regulations and safety standards for self-driving technologies as they continue to evolve and become more prevalent on public roads.
As the debate over the safety and reliability of autonomous driving technologies continues, incidents like this one are likely to fuel further scrutiny of Tesla's FSD system and similar technologies from other manufacturers. The challenge for regulators and automakers alike will be to balance the potential benefits of these advanced systems with the paramount need for public safety on our roads.
For now, the message to drivers is clear: regardless of how advanced a vehicle's autonomous capabilities may be, the responsibility for safe operation ultimately rests with the person behind the wheel. As Craig Doty's experience demonstrates, maintaining alertness and readiness to take control at a moment's notice remains crucial, even when using the most cutting-edge driver assistance technologies.