
TikTok could serve as an 'amplifier of hateful ideologies,' according to new report analyzing Buffalo shooter's beliefs

Kieran Press-Reynolds   

  • Abbie Richards' new report analyzes how some of the Buffalo shooter's ideology is present on TikTok.
  • Richards found that harmful content often doesn't originate on TikTok, but it spreads on the app.

Despite attempts to curb extremism on its platform, there is still a wide array of bigoted and violent content circulating on TikTok, according to a new report from the researcher Abbie Richards.

The report, released Monday by the Global Network on Extremism and Technology, examines some of the themes present in a manifesto allegedly written by the shooter who killed 10 Black people and injured three others at a supermarket in Buffalo, New York, in May. Richards concludes that core elements of the shooter's ideology — from anti-Black rhetoric to ethno-nationalism — are or were recently present in videos on TikTok.

"Content that aligns with components of the Buffalo shooter's manifesto is so pervasive on TikTok that the pervasiveness itself has become a meme," Richards wrote in the report.

TikTok doesn't allow extremist content on its platform, but videos that violate those policies often slip through the cracks. The platform has a long history of housing and recommending racist content and overt disinformation that can gain millions of views before it's taken down.

The far-right extremist Paul Miller, for instance, appeared in a TikTok video that was online for three months, amassing millions of views and hundreds of thousands of likes, before TikTok took it down, according to Richards' report. The video evaded moderation because it used hashtags that purposely misspelled Miller's old TikTok handle. White nationalist content, including references to the number "14" used to signal the white supremacist "14 words" slogan, is also on TikTok.

Although TikTok isn't usually the platform where this extremist content originates — more often it comes from unmoderated fringe forums like 4chan — the app can serve "as an amplifier of hateful ideologies," according to Richards' report, and can "connect users to radicalising content which extremists then can utilise to form communities in less moderated spaces."

Richards told Insider it's common to see TikTok bios that link to "private Discord servers and Telegram chats" where extremist groups gather.

"They'll have an account where they post extremist/extremist-adjacent content and then put the link to the private group in their bio and sometimes reference the group's existence in videos, captions, or comments," Richards said.

There is also a "concerning amount" of TikTok content that instructs viewers on how to produce homemade weapons, according to the report. In May, Media Matters reported on a network of TikTok accounts that released animated videos showing viewers how to make weapons like pipe bombs and Molotov cocktails.

To better understand how extremist content thrives on the platform, TikTok needs to be more transparent about its content moderation policies, Richards told Insider.

This past weekend, Richards said she noticed TikTok had restricted users from searching for or hashtagging "antisemitism" and "Islamophobia," even though those terms are typically used educationally. The terms can be used now, but it's unclear why they were blocked initially.

"How was that decision greenlit in the first place? How many people were involved in that decision?" Richards said. "It's imperative that we have a better understanding of their internal practices."
