- Apple issued an apology on Wednesday after The Guardian reported that contractors tasked with evaluating Siri's performance regularly hear recordings of private conversations.
- The company is making a few changes to how it grades Siri's accuracy, which include no longer recording Siri conversations by default, among other new policies.
- Apple suspended its grading program following the report, but plans to reinstate it this fall after putting the changes into effect.
Apple issued a rare apology on Wednesday over the processes it uses to evaluate Siri's performance, following a report from The Guardian which said company contractors regularly hear recordings of private conversations. Apple is also making some changes to the way it grades Siri interactions.
"As a result of our review, we realize we haven't been fully living up to our high ideals, and for that we apologize," the company says.
Apple announced earlier this month that it would halt its Siri grading program following the report. On Wednesday, it said it plans to resume the program in the fall after making several changes and releasing software updates. "Grading" is the term Apple uses to describe its process for evaluating Siri's performance, which includes having contractors review recordings of Siri interactions for accuracy.
As part of its upcoming changes, Apple says it will no longer record Siri conversations by default. Instead, it will use computer-generated transcripts to monitor Siri's accuracy. But users will be able to opt in to having their recordings shared with Apple to improve Siri's performance.
"We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place," the company wrote.
If and when customers opt in, recordings will be shared only with Apple employees, meaning contractors will not have access to that data. Apple also says it will delete recordings that resulted from accidentally summoning Siri.
The Guardian reported in July that Apple contractors would frequently hear sensitive personal information, such as conversations between doctors and patients, business deals, and sexual interactions. In many of those situations, unintentional Siri triggers were to blame, according to the report.
As a privacy measure, Apple doesn't link a user's Siri requests to their Apple ID account while the requests are processed. Rather, it uses a random identifier, a string of letters and numbers, to track the data being processed.
Apple isn't the only technology company to have raised privacy concerns in recent months over its handling of data gathered from voice-assistant interactions. Bloomberg reported back in April that Amazon contractors were listening to Alexa conversations for evaluation purposes. Similar reports about Google and Microsoft have emerged in recent weeks as well.