A Tesla in 'full self-driving' mode caused an 8-car crash in California and injured 9 people, report says
- Tesla's "phantom braking" problem, in which cars brake unexpectedly, caused an eight-car pileup, according to CNN.
- NHTSA is already investigating the phenomenon.
Tesla's "phantom braking" problem, already under federal investigation, caused an eight-car pileup in the San Francisco Bay Area last month, according to a report from CNN.
Nine people were treated for injuries, including one juvenile who was hospitalized, according to a crash report that CNN says it obtained via a public records request.
The driver of the vehicle that caused the crash told authorities his Tesla was in "full self-driving" mode at the time of the accident, CNN reports.
In the report, the California Highway Patrol said it could not confirm whether the "full self-driving" software was active at the time of the crash, per CNN.
The crash occurred on Thanksgiving, hours after Tesla CEO Elon Musk announced on Twitter that the company's "Full Self-Driving Beta" would be available to all Tesla drivers in North America who had purchased the option.
The US government stepped up an investigation into Tesla's self-driving software this summer after the National Highway Traffic Safety Administration said it had received 758 reports from Tesla owners who said their vehicles had stopped for unknown reasons. The dangerous phenomenon has been dubbed "phantom braking."
In previous complaints to NHTSA, Tesla owners have recounted incidents in which their vehicles unexpectedly and violently braked at highway speeds. These incidents tend to occur when the driver has switched on the Tesla's Autopilot system, which automatically accelerates, brakes, and steers on highways.
Tesla's "Full Self-Driving Beta," released on Thanksgiving, takes that automated driving system one step further. Instead of being limited to highway driving, Tesla owners with Autopilot can hand over control of their vehicle to the "self-driving" software on surface roads, and the Tesla is supposed to complete full trips on its own.
The new software still appears to have some bugs to work through. Earlier this week, a YouTuber posted a video of his 23-minute drive to work using the Full Self-Driving Beta, highlighting how stressful the software can be to use.
Ultimately, the YouTuber said he's comfortable using FSD on the highway, but "not much else."