QAnon networks are evading Twitter's crackdown on disinformation to pump out pro-Capitol-riot propaganda, study says

Jan 6, 2022, 18:32 IST
Business Insider
A person wears a QAnon sweatshirt during a pro-Trump rally on October 3, 2020, in the borough of Staten Island in New York City. Stephanie Keith/Getty Images
  • QAnon accounts are evading Twitter's attempts to ban them, per research seen by Insider.
  • Academic Laura Dilley described strategies that seem to skirt Twitter's enforcement efforts.

Networks of QAnon accounts are using unusual tactics to evade Twitter's ban on disinformation and flood the platform with conspiracy theories, a study shared with Insider found.

Twitter has sought to prevent the QAnon movement from operating on its platform, purging more than 70,000 accounts in the wake of the Capitol riot on January 6, 2021.

But, the study showed, large networks of supporters and influencers are continuing to operate there.

Laura Dilley, an associate professor in the Department of Communicative Sciences and Disorders at Michigan State University, tracked their workings and shared the findings with Insider.

Hashtag swapping, pop-up accounts and hard-to-track images

Dilley followed the rise of four networks promoting QAnon propaganda, operating from August 2020 to the present. Some of the constituent accounts were removed by Twitter in that timeframe, but many remained active, Dilley told Insider.


The most prominent network comprised 1,500 accounts, producing messages that clustered around several core themes.

These included false claims about the January 6 insurrection, conspiracy theories that the 2020 election was stolen from Donald Trump, and a selection of far-right talking points. Many of the accounts were closely tied to networks of white nationalist accounts.

Other, smaller networks clustered around wellness or spirituality themes, highlighting how the political side of the movement overlaps with communities hostile to mainstream science.

Dilley listed techniques she said were commonly used to evade Twitter bans:

  • Replacing banned accounts with new ones under near-identical names.
  • Communicating QAnon messages via images, which are much harder to track and regulate.
  • Using hashtags and phrases with small textual variations to evade automated bans.

"The networks were clearly fairly dynamic in their ability to change hashtags on the fly, for example WWG1WGA [a popular QAnon slogan meaning "Where we go one we go all"] is changed to WWGiWGA, which though a slight variant won't be picked up in automated searches for banned hashtags," said Dilley.


The QAnon movement emerged in 2017, coalescing around the conspiracy theory that a child-abuse ring was being run by elites linked to the Democratic Party. Adherents revere Donald Trump as a saviour figure.

After the January 6 insurrection, Twitter mounted a sweeping crackdown on the movement. QAnon adherents were on the front line of the Capitol attack, and the movement had vigorously embraced the false election-fraud claims that inspired the riot.

Many were able to maintain a presence on the platform despite the crackdown, rapidly rebuilding their networks, partly by using backup accounts.

The users were able to make new profiles with "similar or identical profile pictures, often with Twitter handles that were variants of suspended account handle names," Dilley wrote.

"Digital astroturfing"

Dilley found that the networks often posted the same messages at the same time, and that some accounts rapidly gained massive followings after similarly named accounts were banned. To her, this suggested that many of the accounts were so-called bots: automated accounts operating in a coordinated cluster rather than being run by real people.
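The simultaneity signal lends itself to a simple illustration. The following toy sketch, with hypothetical data and a made-up threshold rather than Dilley's actual methodology, shows the kind of pattern she describes: several distinct accounts posting identical text within the same minute.

```python
from collections import defaultdict
from datetime import datetime

# Toy illustration (not Dilley's actual method) of the coordination signal:
# many distinct accounts posting the same text at nearly the same time.

posts = [
    # (account, text, timestamp) -- hypothetical data
    ("acct_a", "WWG1WGA", datetime(2021, 1, 20, 12, 0, 5)),
    ("acct_b", "WWG1WGA", datetime(2021, 1, 20, 12, 0, 9)),
    ("acct_c", "WWG1WGA", datetime(2021, 1, 20, 12, 0, 11)),
    ("acct_d", "unrelated post", datetime(2021, 1, 20, 15, 30, 0)),
]

# Bucket posts by identical text within the same minute.
clusters = defaultdict(set)
for account, text, ts in posts:
    clusters[(text, ts.replace(second=0))].add(account)

# Flag buckets where several distinct accounts posted in lockstep.
for (text, minute), accounts in clusters.items():
    if len(accounts) >= 3:
        print(f"Possible coordination: {len(accounts)} accounts posted "
              f"{text!r} within the minute starting {minute}")
```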


The use of automated accounts to spread disinformation is banned by Twitter.

Dilley called their coordinated activity "digital astroturfing", an allusion to covert "astroturfing" political campaigns that are designed to create an illusion of grassroots activism.

"This is the first research to definitively show evidence of digital astroturfing in the online promotion of QAnon on Twitter. Further, the research establishes that QAnon promotional activity on Twitter was closely linked with and indeed promoted by a wide variety of networks that span white nationalism," Dilley wrote in the study.

A source at Twitter disputed that large-scale automation was being allowed on the platform. The source requested anonymity, telling Insider that commenting by name on such issues often provokes death threats.

The source said the company's efforts to suppress QAnon were complicated by the fact that it is not a single organization, like a terror group, but a set of overlapping conspiracy theories.


Dilley's study builds on a 2020 Insider report that found the pro-Trump operative Jason Sullivan, who billed himself as "The Wizard of Twitter", operating an app that allowed users to hand over their accounts to post coordinated messages.

The app evaded Twitter's bot ban because the accounts mostly behaved authentically, but could act with bot-like coordination for brief periods to push a desired message.

Dilley suggested that the networks in her study could be using something similar to work together.

She said that straight after starting up new accounts, QAnon influencers were able to rapidly gain large followings, further suggesting automation.

The Twitter source said that they had detected no evidence of large-scale automation in the networks. Banned accounts could quickly gain large followings via backup accounts by coordinating on other platforms such as Telegram, said the source.


Help from Russia?

Parts of the networks, Dilley said, appear to be getting significant support from Russia.

The accounts, some of which had tens of thousands of followers, were designed to appear as though they belonged to Trump-supporting Americans.

They were highly active across all of the networks identified in the study, promoting QAnon propaganda in English. However, the accounts would occasionally start tweeting in Russian or in Cyrillic script.

One account unmasked itself as Russian seemingly by accident, Dilley said: it posted the Russian word for fire where it appeared to have meant to post a fire emoji. The message was soon deleted, and the account resumed communicating in English.

According to reports, Russian intelligence has sought to boost the QAnon movement. It may even have helped to seed the "pizzagate" conspiracy theory, a key precursor of the movement, Rolling Stone reported in 2017.


The Twitter source pushed back against suggestions that Russian security agencies were involved. They said that QAnon was a highly global movement whose claims were circulated and promoted by users in many countries.

They said that the accounts identified as possible Russian actors by Dilley were more likely ordinary Russians engaging with QAnon themes.

Dilley said the study exposes serious failures by Twitter to halt disinformation on its platform in the wake of the January 6 riot.

"Unless Twitter gets serious in its commitment to keeping permanently banned individuals off of its platform, the company will continue to be morally culpable for enabling activities of spreading disinformation, fomenting civil unrest, and undermining democracy," she said.

Twitter did not provide a response on the record.
