Tesla's Autopilot used to be a huge advantage for the company - but now it's becoming a problem

May 19, 2018, 18:44 IST

  • Tesla's Autopilot is a semi-autonomous driver assistance feature that, among other things, can keep a car in its lane and adjust its speed based on surrounding traffic.
  • Two fatal accidents involving Autopilot have illustrated the concerns some auto companies have about semi-autonomous driver assistance systems and whether enough drivers use them correctly.
  • Tesla has repeatedly said Autopilot is meant to be used with an attentive driver whose hands are on the wheel.
  • But analysts have pointed to decisions the company has made when promoting the feature that, in aggregate, have the potential to create false impressions about Autopilot.


A feature that was supposed to set Tesla apart from its competition has been doing so for the wrong reasons lately.

Tesla's Autopilot system is a semi-autonomous driver assistance feature that, among other things, can keep a car in its lane and adjust its speed based on surrounding traffic. (Tesla considers safety features made possible by the Autopilot hardware, like automatic emergency braking, to be part of the Autopilot suite, but those safety features come standard in Tesla vehicles and can activate even if a customer does not buy or use the "Enhanced Autopilot" package, which includes the semi-autonomous features. Concerns about Autopilot have focused on the semi-autonomous features, like Autosteer.)

Since its launch in 2015, Autopilot has been one of Elon Musk's main talking points and has helped the electric carmaker garner a reputation as a leader in automotive tech. But recently, Autopilot has put Tesla in the spotlight for all the wrong reasons.

Two fatal accidents have raised questions


Since Autopilot was released, two fatal accidents have illustrated the concerns some auto companies have about semi-autonomous driver assistance systems and whether enough drivers use them correctly.

The first came in 2016, when Joshua Brown was killed after his Model S crashed into a semi truck in Florida while Autopilot was engaged. The vehicle's data logs revealed that Brown had his hands on the wheel for 25 seconds of the 37 minutes Autopilot was activated during the trip, and that Brown had received seven visual and six auditory warnings to return his hands to the wheel. While the National Highway Traffic Safety Administration (NHTSA) said Autopilot was not responsible for the accident and did not identify any defects in the feature, the National Transportation Safety Board (NTSB) said the feature played a role in the crash.

"Tesla allowed the driver to use the system outside the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention to something other than driving," NTSB chairman Robert Sumwalt said in 2017.

Two months after the accident, Tesla rolled out a software update that disables Autopilot when multiple warnings to apply sufficient torque to the wheel are ignored.

The second fatal accident occurred in March, when a Model X crashed into a highway barrier in California. The driver, Wei Huang, was taken to a hospital and later died. The NTSB has yet to complete its investigation into the accident, but the agency confirmed that Autopilot was engaged during the collision. Tesla said Huang had received multiple warnings to put his hands on the wheel during the drive and indicated that a shortened impact attenuator increased the damage to Huang and the vehicle.


The most recent accident involving Autopilot came on May 11, when a Model S hit a fire department vehicle in Utah. The driver had Autopilot engaged at the time of the accident, but also said she was using her phone. She suffered a broken ankle as a result of the incident.

In a report Tesla provided to the South Jordan Police Department, the company said the driver took her hands off the wheel over 12 times during the drive, including for the 80 seconds before the collision.

While the company has repeatedly said Autopilot is meant to be used with an attentive driver whose hands are on the wheel, analysts have pointed to decisions the company has made when promoting the feature that, in aggregate, have the potential to create false impressions about Autopilot.

Autopilot's name could create misconceptions


If Autopilot is misused, it could result in accidents that would be avoided by an attentive driver.


That means Tesla has to strike a difficult balance when selling the feature: highlighting Autopilot's benefits while outlining its limitations. But the feature's name could be getting in the way of that.

The word "autopilot" is typically associated with systems that steer airplanes, boats, or spacecraft without human assistance. In contrast, the names given to competing driver assistance systems, like Super Cruise or Nissan's Pro Pilot Assist, more clearly create the impression that they're augmenting, rather than replacing, the driver, Autotrader executive editor Brian Moody told Business Insider

"Even if they work similarly and even if the admonitions are the same, the name itself I think is the first place that you start in looking at this," Moody said. "The difference is that one name implies it does it by itself and the other one implies that you need to be there as a person because it's giving you an assist."

A Tesla representative disputed that conclusion, citing a 2016 study by a German marketing company that surveyed 675 Tesla owners. The survey found that 98% of respondents answered "Yes" when asked if they understood that when using Autopilot, they were expected to maintain control of the vehicle at all times.

Tesla's description of Autopilot could confuse some customers


Tesla's website may also create confusion for some consumers. When you order a Model S sedan or Model X SUV, Tesla uses the following language to describe Autopilot's capabilities:

"Your Tesla will match speed to traffic conditions, keep within a lane, automatically change lanes without requiring driver input, transition from one freeway to another, exit the freeway when your destination is near, self-park when near a parking spot and be summoned to and from your garage. That said, Enhanced Autopilot should still be considered a driver's assistance feature with the driver responsible for remaining in control of the car at all times."

In the following paragraph, the company writes, "Tesla's Enhanced Autopilot software has begun rolling out and features will continue to be introduced as validation is completed, subject to regulatory approval."

A customer could read that description and believe Autopilot can control a significant portion of highway driving. But some of the listed features - like the ability to change lanes without driver input, transition from one freeway to another, and exit the freeway when a destination is near - are not included in Autopilot's current iteration. Instead, they're among the features Tesla plans to introduce in the future.

A Tesla representative told Business Insider that its customers have not expressed confusion about the language on its website.


Beneath the box that describes Autopilot's capabilities is one that gives customers the ability to purchase what Tesla calls "Full Self-Driving Capability."

The company makes clear that it is only referring to hardware that will require a future software update to achieve full autonomy. But the fact that the company offers the hardware at all could lead customers to believe its vehicles are closer to full autonomy than they may be and, by extension, to ascribe more functionality to Autopilot than it currently has, Gene Munster, a managing partner at the venture capital firm Loup Ventures, told Business Insider.

"It's as if it's almost ready to go, and that, I think, builds a little bit of false confidence in the current product," he said.


The tension within the company's approach to selling Autopilot is most clear when its employees have demonstrated the feature.


Munster said he's seen Tesla employees take their hands off the wheel during test drives, and Musk himself did so on multiple occasions during an appearance on CBS This Morning that aired in April.

Early in the segment, Musk takes his hands off the wheel to show how the vehicle alerts the driver when it senses no hands on the wheel, but he keeps his hands off the wheel after that and can be seen driving hands-free later in the segment.

"There's the guy, the owner of the company, on TV driving the car with no hands on the wheel. The two messages don't go together," Moody said.

Tesla's favorite safety stats may not tell the whole story

The NHTSA disputed Tesla's claim that a 2017 report from the agency proves Autopilot reduced crash rates.

As a pioneer of advanced driver assistance systems, Tesla is more likely to receive attention for an accident involving Autopilot than its competitors would be for their semi-autonomous systems. Even if the feature works as planned, a few high-profile incidents have the potential to create concerns about safety.


During Tesla's first-quarter earnings call in May, Musk expressed frustration about the media attention Autopilot has received after high-profile accidents and said that such coverage makes drivers less safe.

"It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe. Because people might actually turn it off, and then die," he said.

While Tesla has made clear that Autopilot is not a substitute for an attentive driver, the company has said that the feature makes its vehicles safer. The company often cites two statistics to support that point, but the data may not tell the whole story.

The first comes from a 2017 NHTSA report that examined crash rates before and after Autopilot became available, which Tesla has used to claim Autopilot reduced crash rates by as much as 40%. The report distinguishes between Tesla's automatic emergency braking features and Autopilot's semi-autonomous driver assistance features and considers Autopilot to consist of Autosteer, which keeps a car in its lane, and traffic-based speed adjustment.

In May, the NHTSA told Ars Technica that the study Tesla cited did not necessarily prove Autopilot reduced accident rates. The agency said that while the study compared accident rates before and after Autosteer became available, it did not evaluate Autosteer's ability to prevent accidents.


Tesla declined to comment on the NHTSA's statement.

Tesla has also said that vehicles equipped with Autopilot hardware are 3.7 times less likely to be involved in a fatal accident than other vehicles, implying that Autopilot is a decisive factor in that difference. But it's unclear if the company was able to separate incidents in which Autopilot was engaged from those in which it wasn't.

Tesla did not respond when asked about the origins of the data behind that statistic, but Musk said on Twitter on Monday that Tesla vehicles were four times better than average, based on 2017 automotive fatality data from the NHTSA.

Moody, though, said the statistics don't provide a complete picture of Autopilot's effect on vehicle safety.

"This is an example of how facts don't really tell the whole story," Moody said. "It is too early to say whether automated features, in general, truly have an impact on passenger safety."


The best approach to promoting Autopilot may be a conservative one

Some of Tesla's competitors, like General Motors, Nissan, and Daimler, have also introduced driver assistance features with similar capabilities, but others are nervous about including semi-autonomous systems in their vehicles because they fear drivers will place too much trust in them and fail to pay attention to the road.

In Tesla's case, that trust comes from the company's success in developing effective semi-autonomous features.

"The technology is so good, it's natural that people jump to conclusions or trust it more than they should," Munster said.

To combat this concern, Cadillac's Super Cruise feature, which was introduced in 2017, can track a driver's eyes to determine if the driver is alert. If the system senses that the driver is not engaged, it will send the driver multiple warnings and, if necessary, pull the car over to the side of the road.

Autopilot will also warn the driver - and eventually deactivate until the car is parked - if it doesn't detect a sufficient amount of torque on the wheel, but multiple Tesla owners have demonstrated ways to trick the sensors by placing objects, like an orange, between the wheel's upper and middle sections.


Tesla has no trouble selling its cars. They're stylish, win awards, have consistently high customer satisfaction ratings, and are unveiled at events that no other automaker can match. But with debt concerns, a strained relationship with Wall Street analysts, and a history of failing to meet production goals for its mass-market Model 3 sedan, Tesla doesn't need another problem on its plate.

The best approach to promoting Autopilot, then, may be a conservative one, Moody said.

"There may be wisdom in sometimes taking a cautious approach."
