An MIT study found that Tesla Autopilot users were better than expected at taking over before potentially dangerous situations
- A study from Massachusetts Institute of Technology (MIT) researchers found that participants were better than expected at taking control from Tesla's Autopilot system before potentially dangerous situations.
- "In our data, drivers use Autopilot frequently and remain functionally vigilant in their use," the authors wrote. "These results are surprising as they do not align with the prediction of prior literature on human monitoring of automation."
- But the study's authors emphasize that their findings cannot be extended to all Autopilot users or to other semi-autonomous systems.
A study from Massachusetts Institute of Technology (MIT) researchers found that participants were better than expected at taking control from Tesla's Autopilot system before potentially dangerous situations, adding a new layer to the debate about semi-autonomous-driving systems.
The introduction of semi-autonomous, driver-assistance systems that can control steering, braking, and acceleration in some circumstances, but require that the driver be ready to take over, has raised questions about how well people can pay attention to the road as their cars handle an increasing percentage of driving tasks.
Much of this debate has centered on Autopilot, a pioneer among semi-autonomous systems when it was introduced in 2015. Tesla has presented data indicating that drivers get in fewer accidents when Autopilot is activated, but the data doesn't provide enough detail to isolate Autopilot's effect from other factors, such as the fact that fewer accidents occur on highways, where Autopilot appears to be used most often, than on residential streets. Fatal accidents involving Autopilot highlight its limitations, but they are too anecdotal to support broad conclusions.
Drivers were more attentive than expected
The MIT study, titled, "Human Side of Tesla Autopilot: Exploration of Functional Vigilance in Real-World Human-Machine Collaboration," sought to observe how well Autopilot users were able to remain aware of their surroundings and take over from the system when necessary. The researchers looked at 8,729 instances in which Autopilot was deactivated, either by the driver or the system itself, before encountering a potentially dangerous situation, and how quickly the driver reacted to it. (Each of the 21 vehicles included in the study had cameras filming the driver, the vehicle's interior, and the area in front of the vehicle.)
The researchers classified a response as late if the driver took more than one second to react to the potentially dangerous situation. In none of the 8,729 instances in which Autopilot was deactivated before a potentially dangerous situation did drivers respond late, a result that surprised the study's authors.
"In our data, drivers use Autopilot frequently and remain functionally vigilant in their use," the authors wrote. "These results are surprising as they do not align with the prediction of prior literature on human monitoring of automation."
The authors offer two possible explanations for the drivers' ability to react quickly: drivers knew Autopilot was imperfect, and they sometimes used it in situations it was not designed for, both of which made them extra attentive.
The study had a limited scope
But the study's results do not prove that Autopilot users, as a whole, are attentive when using the system. The authors emphasize that their findings cannot be extended to all Autopilot users or to other semi-autonomous systems, in part because of the study's relatively small sample size, the subjective process used to evaluate driver reactions, and the fact that the study did not record instances in which drivers faced potentially dangerous situations that did not lead to an Autopilot deactivation.
While the study does not settle questions about Autopilot's safety, the results are encouraging for Tesla, and they illustrate that there is still much to learn about the relationship between drivers and semi-autonomous technology.