TikTok reportedly recommends explicit videos about drugs and sex to its youngest users.
- An investigation by The Wall Street Journal found hundreds of adult videos on the For You Pages of users aged 13-15.
- A TikTok spokesperson said the app doesn't differentiate between the videos it recommends to adults and to minors.
TikTok dishes out drug-related and sexually explicit content to minors, a recent investigation by The Wall Street Journal found.
Using dozens of automated bots that were registered as users between the ages of 13 and 15, the publication found that TikTok provided the accounts with scores of videos promoting rape, drugs, and sexual fantasies, including some depicting caregivers and children.
Teens make up the largest segment of TikTok's roughly 100 million monthly active users. Last year, minors accounted for more than a quarter of the app's users, according to company data.
An earlier investigation by The Journal found that TikTok curates a user's For You Page based on the content the user lingers on in their feed. Using the same methodology, bots that lingered on content featuring drugs quickly saw their For You Pages taken over by nearly 600 drug-related videos. The Journal said the feed went down a rabbit hole of content advertising how to get drugs, along with links to outside web pages selling illegal substances.
Similarly, bots that lingered on more sexual content were bombarded with videos about sexual power dynamics and violence, as well as links to outside porn pages such as OnlyFans. The publication said one bot's For You Page became so focused on what many call "Kinktok" that 90% of its videos were about sex and bondage. Many of the sexually explicit videos carried tags indicating they were intended for "adults only."
Some of the content The Journal encountered is banned under TikTok's community guidelines. The publication said hundreds of the videos were removed from the platform before it could share them with TikTok, but it did share 974 examples of the explicit content with the company.
A TikTok spokesperson did not respond to Insider's request for comment in time for publication but told The Journal that the company declined to comment on individual videos. The spokesperson said the majority of the videos did not violate TikTok's policies, though the company told The Journal it removed some of the videos after the publication flagged them and restricted the distribution of others.
The spokesperson also said the app doesn't differentiate between the videos it serves to adults and to minors, though the platform is looking to create a tool that filters content for young users.
In July, an Insider investigation found that TikTok's algorithm auto-suggests content about eating disorders that appeared to violate the app's community guidelines.