Tesla ordered by German court to reimburse a woman $101,000 because her Autopilot was unreliable and kept making the car brake, report says
- A Munich court ordered Tesla to reimburse a woman 99,416 euros ($101,000), court documents show.
- The woman said her Model X's Autopilot feature failed to recognize hazards and braked unexpectedly.
A court in Munich, Germany, has ordered Tesla to pay a woman more than $101,000 after she complained about problems with her Tesla's Autopilot feature, as first reported by Der Spiegel.
The woman who brought the case said she signed a contract to buy the car in December 2016 for 112,640 euros ($114,500). The car was delivered in March 2017, per a copy of the judgement seen by Insider. The owner paid an extra 5,500 euros ($5,580) for the Autopilot feature, per the judgement.
The Tesla owner said she began having repeated problems with the car's Autopilot function as far back as November 2017.
The court upheld the woman's complaint that the car's Autopilot was defective. The judgement said the Autopilot had been unreliable in identifying obstacles and that the brakes could suddenly activate for no apparent reason. The court ruled that the sudden braking posed a "massive danger" in city traffic.
Tesla's lawyer argued that the Autopilot function is not intended for use in city traffic, per Der Spiegel, but the court rejected this argument in its ruling.
The judgement said that manually toggling Autopilot on and off for different kinds of traffic could be distracting for the driver.
Tesla does not list city traffic as a limitation on Autopilot's functionality on its support page.
The judgement also upheld the woman's complaints about the car that were unrelated to Autopilot, such as its doors not opening and closing properly.
The court ordered Tesla to pay the woman 99,420 euros ($101,000) plus 5% interest. It also ruled that Tesla must pay 80% of the legal fees, while the woman must pay 20%.
Tesla did not immediately respond when contacted by Insider about the case.
The case sets an uncomfortable precedent for Tesla, which is under close regulatory scrutiny over its Autopilot feature.
Autopilot is a driver-assistance feature that allows the car to steer, accelerate, and brake automatically within its lane. Tesla's website says Autopilot does not make vehicles fully autonomous and requires "active driver supervision."
The National Highway Traffic Safety Administration (NHTSA) is also conducting a broader probe into the effectiveness and safety of Autopilot, which it launched after identifying 11 instances of Tesla cars crashing into first-responder vehicles while using Autopilot.
In February, NHTSA launched a separate investigation into so-called "phantom braking," when Teslas on Autopilot slam on the brakes for no apparent reason. The agency told Tesla in May that it had received complaints about phantom braking from more than 750 Tesla drivers.
Tesla CEO Elon Musk has lauded the company's self-driving technology and said in December 2021 that it is unfairly criticized.
"I think it's one of those things where you're not going to get rewarded necessarily for the lives that you save, but you will definitely be blamed for lives that you don't save," Musk told TIME.
Musk has repeatedly promised that fully autonomous cars would arrive in the near future, but the company has yet to realize this goal.