- Over the past several months, we've seen a slew of reports about how audio recordings captured by voice assistants like Amazon's Alexa, Apple's Siri, Google Assistant, and Microsoft's Cortana were sent off to human contractors for further evaluation.
- In the case of Siri, for instance, contractors "regularly" heard recordings of people having sex, business deals, and private doctor-patient discussions, according to a July report.
- Many of these companies have since suspended or halted their manual-review practices entirely, but if you own any gadgets with microphones in them, it's important to know how each company handles your audio.
If you own a device with a microphone in it, chances are that audio snippets were recorded - with or without your knowledge - and sent off to other human beings for examination.
This year, we've seen a handful of reports all saying the same thing: The biggest tech companies in the world still need humans to evaluate the accuracy of their AI assistants - like Amazon's Alexa, Apple's Siri, Google Assistant, and Microsoft's Cortana - which still have trouble recognizing speech. These humans, often contractors rather than full-time employees of the tech conglomerates, are responsible for quality control: They grade responses from voice assistants to see whether they were actually helpful.
These tech companies often go to great lengths to ensure privacy and confidentiality, but contractors who spoke on condition of anonymity have said it's not hard to identify who's talking, since audio recordings often include names and addresses.
Since some of these reports came out, many of these tech companies have decided to either suspend their voice-analysis practices or halt them entirely.
Here's what we know about each of the tech companies, and how they currently handle your audio.