TikTok moderators say they were shown sexually explicit images and videos of kids as part of training, report says
- Content moderators hired to review videos for TikTok said they were shown sexually explicit material of children, they told Forbes.
- The report cited moderators who worked for Teleperformance, a third-party company.
Content moderators hired to review videos for TikTok said they were shown sexually explicit material of children as part of their training materials, according to a report published Thursday by Forbes.
The report cited moderators who worked for Teleperformance, a third-party company that TikTok contracts to moderate content on the platform.
One former employee, identified in the report as Nasser, told Forbes he worked on a team that helped TikTok's artificial intelligence identify content that wasn't allowed on the platform. Nasser alleged that he and his colleagues were shown sexually explicit images and videos of children as part of their training at the company.
As TikTok has grown in popularity in recent years, it has outsourced much of its content moderation to outside companies that hire staff to manually review videos that may violate the platform's policies on graphic content. As with Facebook before it, many of these content moderators have reported experiencing significant mental trauma and alleged that TikTok failed to adequately equip them for the job.
Another former Teleperformance employee, Whitney Turner, told Forbes she worked at the company for more than a year before leaving in 2021. Turner said she had access to a spreadsheet called the "Daily Required Reading," or the "DRR," which was accessible to hundreds of Teleperformance and TikTok employees, according to the report. The spreadsheet contained examples of content prohibited by TikTok, including images of naked and abused children, she told the outlet.
"I was moderating and thinking: This is someone's son. This is someone's daughter. And these parents don't know that we have this picture, this video, this trauma, this crime saved," Turner told Forbes. "If parents knew that, I'm pretty sure they would burn TikTok down."
Turner said she turned the document over to the FBI and met with an agent in June, according to the report.
A TikTok spokesperson told Insider on Thursday that the company's training materials provide textual descriptions, not visual examples, of child sexual abuse material (CSAM), but added that it works with external companies that may have their own practices.
"Content of this nature is abhorrent and has no place on or off our platform, and we aim to minimize moderators' exposure in line with industry best practices," the spokesperson said. TikTok's training materials have strict access controls and do not include visual examples of CSAM, and our specialized child safety team investigates and makes reports to NCMEC."
The NCMEC is the National Center for Missing & Exploited Children, a non-profit organization founded in the 1980s that serves as the "national clearinghouse and resource center for information about missing and exploited children," according to the organization. Companies like TikTok are legally required to report CSAM found on their platforms to the NCMEC.
Representatives for Teleperformance did not respond to Insider's request for comment sent Thursday. Akash Pugalia, the global president of trust & safety at Teleperformance, told Forbes the company did not use explicit videos of child abuse in its training and did not have those materials in its "calibration tools."
TikTok's moderators allege PTSD and take legal action
Employees of other contractors that TikTok uses to moderate its content have also sounded the alarm about disturbing content they were forced to view for work. Several current and former moderators, who spoke to Insider for an article published in June, detailed a culture of overwork and surveillance while they were employees at the Nevada office of Telus International.
Two people who spoke with Insider said they had been diagnosed with PTSD since working as TikTok moderators. Candie Frazier, a moderator who spoke to Insider, sued the company, saying she was forced to view beheadings, accidents, suicides, and child abuse material, Insider previously reported.
"Moderating for TikTok hurt," Frazier, who worked for the third-party company Telus International, told Insider earlier this year. "It made me lose my trust in people," she added.
TikTok is hardly the only social-media giant to face questions about its CSAM moderation practices. Meta came under fire earlier this year after a report from The New York Times said moderators working on behalf of the company were told to "err on the side of an adult" when they couldn't tell someone's age in a photo or video. A company spokesperson confirmed the policy to the Times.
Moderators for social-media behemoths have long sounded the alarm about the perils of moderating disturbing content for platforms. The Verge reported in 2019 that some moderators reviewing content for Facebook were developing symptoms similar to post-traumatic stress disorder after leaving the job. A former content moderator in Kenya is currently suing Meta and one of its subcontractors, alleging that the company engaged in human trafficking and forced labor.
If you are a moderator for TikTok or any social platform and wish to speak to this reporter, email him at cperrett@insider.com.