- The NHTSA sent Tesla two letters demanding information about its Autopilot and Full Self-Driving software.
- The safety regulator expressed concern about the software's use on the road after reported accidents.
Top US safety regulators are questioning why Tesla did not issue a recall when it recently updated its Autopilot software to better recognize stopped emergency vehicles.
In two letters sent to Tesla, the National Highway Traffic Safety Administration is seeking additional information about the Autopilot update and the company's release of Full Self-Driving Beta, expressing concerns about the software's use on the road. One of the letters also asks Tesla to provide information about non-disclosure agreements between the company and its vehicle owners.
Teslas operating in Autopilot have had difficulty identifying emergency vehicles surrounded by flashing lights, flares, illuminated arrow boards, or traffic cones. Several incidents in which Autopilot-enabled Teslas crashed into stopped emergency vehicles prompted an NHTSA investigation. Shortly after that probe was announced, Tesla updated Autopilot in September with the aim of addressing these issues.
When switched on, Autopilot keeps a Tesla centered in its lane and watches the car ahead to keep a steady distance. Autopilot does not make cars drive themselves and requires full driver attention.
In its letters, NHTSA told Tesla that the company must issue a recall if a software update fixes a safety defect.
"Through these actions, NHTSA continues to demonstrate its commitment to safety and its ongoing efforts to collect information necessary for the agency to fulfill its role in keeping everyone safe on the roadways, even as technology evolves," NHTSA said in a statement to Insider. "NHTSA's enforcement and defect authority is broad, and we will act when we detect an unreasonable risk to public safety."
Tesla did not immediately respond to Insider's request for comment.
The letters are the latest development in an ongoing battle between Tesla and NHTSA. The regulator, which is responsible for enforcing vehicle performance standards, is now investigating 12 accidents involving Tesla's Autopilot and stopped emergency vehicles. According to the agency, the initial 11 crashes under investigation led to 17 injuries and one death.
"As Tesla is aware, the Safety Act imposes an obligation on manufacturers of motor vehicles and motor vehicle equipment to initiate a recall by notifying NHTSA when they determine vehicles or equipment they produced contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard," one of the letters says.
The launch of Tesla's FSD Beta was delayed in September due to "last minute concerns," according to a tweet from CEO Elon Musk. NHTSA expressed concern about the software's use on public roads and about Tesla drivers' inability to report any issues because of "non-disclosure agreements that allegedly limit the participants from sharing information about FSD that portrays the feature negatively, or from speaking with certain people about FSD," according to one of the letters.