Apple contractors working on Siri 'regularly' hear recordings of sex, drug deals, and private medical information, a new report says

Jul 26, 2019, 23:53 IST

  • Apple contractors in charge of quality control for Siri, the company's voice assistant, "regularly" hear private and confidential information, according to a new report.
  • According to The Guardian's Alex Hern, those contractors "regularly hear confidential medical information, drug deals, and recordings of couples having sex."
  • Apple says its contractors hear "a small proportion of Siri recordings" to grade the assistant's responses, even when Siri is activated accidentally.
  • Visit Business Insider's homepage for more stories.

None of the major voice assistants - including Apple's Siri, Amazon's Alexa, and Google Assistant - is as private as you might think.

As we wrote back in April, the companies behind virtual assistants like Alexa tend to have employees or contractors manually review clips of conversations in order to train and improve those services - a form of quality control.

A new report from The Guardian's Alex Hern on Friday sheds more light on how Siri actually works. Hern spoke with an anonymous contractor performing quality control on Siri, who said they were concerned about how often the assistant picks up "extremely sensitive personal information."

According to The Guardian's source, contractors working on Siri "regularly" hear recordings of people having sex, business deals, private medical discussions between doctors and patients, and even drug deals.

Whether or not Siri's activation was intentional - and often it isn't, as the anonymous contractor said Siri can mistake the sound of a zip for its trigger - these Apple contractors are responsible for grading Siri's responses. They note whether the activation was accidental, along with other factors, such as whether Siri's answer was appropriate or helpful and whether the request was something Siri could reasonably be expected to handle.

We reached out to Apple about the issue but have yet to hear back. The company told The Guardian that "less than 1%" of daily Siri activations are reviewed, that no Siri requests are associated with users' Apple IDs, and that reviewers work "under the obligation to adhere to Apple's strict confidentiality requirements." Yet the whistleblower told The Guardian that Siri recordings "are accompanied by user data showing location, contact details, and app data," which Apple can use to help determine whether a request was acted on.

The report raises significant privacy concerns about the process. According to the anonymous contractor who came forward, Siri's quality-control workers could potentially misuse this private information, since there are "no specific procedures to deal with sensitive recordings."

"There's not much vetting of who works there, and the amount of data that we're free to look through seems quite broad," the contractor told The Guardian. "It wouldn't be difficult to identify the person that you're listening to, especially with accidental triggers - addresses, names, and so on ... It's not like people are being encouraged to have consideration for people's privacy."

According to Apple, the company saves Siri voice recordings for six months; after that, it keeps a copy of the data stripped of any identifier for up to two years to improve Siri's performance and accuracy. Some companies collect more identifying information - Amazon with Alexa, for instance - but unlike Amazon and Google, Apple does not let users opt out of having their recordings reviewed, short of turning off Siri entirely.

You can read the full report over at The Guardian.
