U.S. regulator investigating Tesla's self-driving software after fatal crash
CBC
The U.S. National Highway Traffic Safety Administration on Friday said it was opening an investigation into 2.4 million Tesla vehicles with the automaker's Full Self-Driving software after four reported collisions, including a fatal crash.
The auto safety regulator said it was opening the preliminary evaluation after four reports of crashes in which FSD was engaged during conditions of reduced roadway visibility, such as sun glare, fog or airborne dust.
In one crash "the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury," NHTSA said.
The probe covers 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck vehicles.
The preliminary evaluation is the first step before the agency can demand a recall of the vehicles if it determines they pose an unreasonable risk to safety.
Tesla says on its website that its Full Self-Driving software, as offered in on-road vehicles, requires active driver supervision and does not make the vehicles autonomous.
NHTSA is reviewing the ability of FSD's engineering controls to "detect and respond appropriately to reduced roadway visibility conditions."
The agency is asking whether similar FSD crashes have occurred in reduced roadway visibility conditions, and whether Tesla has updated or modified the system in a way that may affect its performance in those conditions.
The "review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla's assessment of their safety impact," NHTSA said.
Tesla CEO Elon Musk is seeking to shift Tesla's focus to self-driving technology and robotaxis amid competition and weak demand in its auto business.
The company did not immediately respond to requests for comment. Its shares were down 0.5 per cent before the bell.
Last week, Musk unveiled Tesla's two-seat, two-door "Cybercab" robotaxi concept, which has no steering wheel or pedals and would rely on cameras and artificial intelligence to navigate roads. Tesla would need NHTSA approval to deploy a vehicle without human controls.
Tesla's FSD technology has been in development for years and aims for a high level of automation, in which the vehicle can handle most driving tasks without human intervention.
But it has faced legal scrutiny over at least two fatal accidents involving the technology, including an April incident in which a Tesla Model S in Full Self-Driving mode hit and killed a 28-year-old motorcyclist in the Seattle area.