
The NTSB wants Elon Musk to limit where Tesla's Autopilot can be used

Oct 26, 2021, 00:24 IST
Business Insider
Image: The Tesla Model S (Tesla)
  • The National Transportation Safety Board is urging Elon Musk to make Autopilot safer.
  • The letter from the agency's chairwoman comes amid increasing scrutiny of Tesla's technology.

The head of the National Transportation Safety Board sent a fiery letter to Elon Musk urging him to implement safety measures the agency recommended in the wake of a fatal Autopilot crash more than four years ago.

NTSB Chairwoman Jennifer Homendy pressed Tesla's CEO to respond to two recommendations. The agency wants Tesla to restrict Autopilot use to certain types of highways where it's safest, and to figure out a way to better monitor driver attention.

"Our crash investigations involving your company's vehicles have clearly shown that the potential for misuse requires a system design change to ensure safety," Homendy said. She said she was "deeply concerned" at what she called Tesla's inaction on these issues.

Autopilot, which uses cameras to keep a Tesla in its lane and maintain a steady distance from the vehicle ahead, has come under increased scrutiny after a series of crashes highlighted design flaws and the system's potential for abuse.

The National Highway Traffic Safety Administration recently opened an investigation into incidents where Teslas on Autopilot collided with stopped emergency vehicles like ambulances and police cars. Numerous viral videos have appeared to show Tesla drivers asleep or otherwise distracted with Autopilot engaged.


In its investigation of the 2016 crash that killed Joshua Brown, whose Model S collided with a turning semi-truck, the NTSB found that Brown had his hands off the wheel for extended periods of time and that he was using Autopilot on roads it wasn't designed for. Autopilot requires full driver attention and does not make cars autonomous.

Following that investigation, the agency sent safety recommendations to Tesla and five other automakers regarding their driver-assistance technologies. Tesla alone did not respond, Homendy said. During a later investigation, Tesla told the agency that it is up to drivers to pay attention and to decide where they want to turn on Autopilot.

Homendy also criticized Tesla for expanding its prototype Full Self-Driving software to more drivers before addressing issues with Autopilot. That more advanced system aims to ferry drivers around city streets while recognizing pedestrians, stop lights, and the like, but it is far from ready.

"If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago," Homendy said.
