Tesla Probe Widens as FSD Complaints Surge
Growing questions around Tesla’s Full Self-Driving (Supervised) software are drawing renewed attention as U.S. regulators log more reports of red-light violations and lane-crossing incidents. Drivers searching for updates on FSD safety, the investigation’s progress, or a potential recall will find that federal officials have now recorded at least 80 incidents, significantly more than initial estimates. This rapid rise in reports has intensified scrutiny of whether the driver-assistance system can reliably detect traffic signals, road markings, and hazardous scenarios on public roads.
FSD Safety Complaints Rise as Regulators Expand Review
The National Highway Traffic Safety Administration (NHTSA) confirmed that its latest findings build on a growing list of safety complaints tied to Tesla’s FSD (Supervised) technology. According to a letter sent to Tesla this week, the agency has now collected 62 complaints directly from drivers. Another 14 were supplied by Tesla, alongside four additional reports gathered from media coverage. Each one involves the software allegedly running a red light or steering the vehicle across lane boundaries. This marks a sharp increase from the roughly 50 cases cited when the investigation opened in October, signaling that the potential issues may be more widespread than initially believed.
Regulators Question FSD’s Ability to Detect Roads and Signals
NHTSA’s Office of Defects Investigation (ODI) is now analyzing whether the system can accurately detect traffic signals, lane markers, and posted signs during real-world driving. Investigators are also evaluating whether Tesla’s driver-assistance system provides adequate warnings when the vehicle fails to follow road rules. This broader line of inquiry reflects growing concerns among safety experts over whether FSD’s current capabilities match Tesla’s marketing language or drivers’ expectations. The agency has requested detailed responses from Tesla, which are due by January 19, 2026.
Tesla’s Own Data Adds to the Growing List of Incidents
Part of the increased incident count comes from Tesla itself. The automaker’s reporting contributed 14 cases, though the company heavily redacts its submissions before sending them to regulators. This makes it challenging to understand the context or location of many incidents. Still, Tesla’s participation in the data-gathering phase indicates that the company is aware of the concerns around FSD behavior and may be quietly investigating scenarios where the software struggled to respond appropriately. Such incomplete data also raises further questions for drivers who depend on the system daily.
Earlier Reports Highlighted a Single Maryland Intersection
When ODI first announced the investigation in October, many of the documented incidents occurred at one particular intersection in Joppa, Maryland. Tesla stated at the time that it had already “taken action to address the issue,” leading some observers to believe the problem was localized. However, these newly reported cases do not appear to be tied to the same location. Regulators have not disclosed the geographic distribution of the latest incident reports, suggesting the behavior may be happening across a broader range of driving environments.
Why FSD’s Red-Light Behavior Matters for Drivers
Red-light violations are among the most dangerous road-rule breaches. When a driver-assistance system fails to recognize a signal change, the result can be a severe collision at an intersection, an area already known for high accident rates. Lane-crossing incidents add a further layer of risk, especially on highways where traffic moves at higher speeds. For Tesla owners who rely on FSD (Supervised) to manage parts of their commute, this investigation raises urgent questions about when and where the software can be trusted. Safety organizations argue that understanding these limitations is essential before FSD expands to even more users.
Growing Scrutiny Comes at a Critical Moment for Tesla
This escalation in complaints arrives during a pivotal phase for Tesla’s autonomous-driving ambitions. The company has aggressively promoted FSD (Supervised) as a major differentiator in the EV market, repeatedly highlighting its ability to navigate complex roads with minimal intervention. But as federal scrutiny intensifies, Tesla may face stricter oversight, delayed feature rollouts, or mandated safety updates. Investors watching the company’s software strategy could see the investigation as a significant risk factor for future growth in autonomous vehicle technology.
What This Means for the Future of FSD Updates
Tesla frequently updates FSD through over-the-air software patches designed to improve responsiveness, perception, and driver monitoring. If ODI’s investigation identifies systemic flaws, Tesla may be required to issue broader corrective updates or modify the system’s operating limits. Users who have already noticed inconsistent behavior—such as hesitation at intersections or drifting near lane edges—may soon receive new updates meant to address these issues. Regulators will likely push for stronger safeguards, particularly in situations where the system misinterprets road boundaries or light cycles.
Drivers Continue Reporting Issues Despite Software Fixes
The fact that complaints continue to rise even after Tesla says it has fixed specific problem areas suggests that the challenges extend beyond isolated intersections. Many drivers report intermittent failures that are difficult to reproduce, which complicates both diagnosis and repair. As FSD becomes more widely used across diverse driving environments, regulators may receive an even broader picture of its real-world performance. This pattern also underscores why federal agencies remain cautious about software marketed as “self-driving,” even with supervision requirements.
Will This Lead to a Recall? Too Early to Tell
While some observers believe the growing complaint count could trigger a recall, ODI has not indicated that such action is imminent. Recalls typically occur when regulators determine that a defect poses an unreasonable safety risk, and investigations can take months to reach conclusions. Tesla has previously issued software-based recalls for Autopilot and FSD features, meaning a similar scenario isn’t off the table. For now, the agency is focused on gathering detailed information about system behavior, driver interactions, and Tesla’s internal testing results.
What Tesla Owners Should Watch For Next
Tesla owners following the investigation should look for new communications from the company regarding FSD performance and pending updates. As the January 19 deadline approaches, more information may become public about how Tesla plans to address regulators’ concerns. Drivers who have experienced unusual FSD behavior are encouraged to file reports directly with NHTSA, which uses these submissions to guide the scope of its investigations. With complaints rising and federal pressure intensifying, the coming months will determine whether Tesla can demonstrate that FSD is ready for broader deployment—or whether deeper fixes are still needed.