An internal Facebook audit reportedly shows QAnon groups have millions of members, but some employees who ran the investigation fear the company won't take any action
- As part of an internal investigation probing communities with potential ties to violence on its site, Facebook found millions of followers across thousands of groups associated with the far-right fringe conspiracy theory QAnon, per an NBC News report.
- However, some Facebook employees fear the company won't take the necessary action against the 3 million online QAnon group members, according to internal documents viewed by NBC News.
- The employees also expressed concern that QAnon's presence on the popular social network might influence the upcoming 2020 presidential election.
- Facebook has faced pushback for its hands-off approach to content moderation in the past. It isn't the only social media firm to crack down on QAnon content recently: Twitter and TikTok have taken similar steps.
Facebook reportedly conducted an internal investigation into QAnon, which turned up evidence that the conspiracy theory may have reached millions of users through thousands of groups on the platform.
Internal company documents viewed by NBC News show that Facebook's scrutiny of QAnon's spread was just one aspect of a broader scan of communities on the platform with possible ties to violence, including "militias and other violent social movements."
The investigation was launched to help Facebook decide how to address QAnon's presence on the platform. Per the report, one option would be for the social media firm to stop amplifying QAnon group pages in its recommendations. The company could also ban advertising associated with the far-right movement.
One finding in the preliminary report identified 185 ads for merchandise and demonstrations that were "praising, supporting, or representing" QAnon, according to NBC News; over a 30-day period, those ads earned Facebook roughly $12,000 and generated 4 million impressions.
But anonymous Facebook employees involved in the investigation told NBC News they don't think the company will implement outright bans of QAnon groups and will instead respond with weaker actions. The employees reportedly also said there is concern at the company over how much influence QAnon's Facebook presence could have on the upcoming 2020 presidential election.
Facebook's Groups feature has been used to form a broad range of communities, but it has also served as a meeting ground for radical groups, including QAnon supporters, as NBC News notes.
In an emailed statement to Business Insider, a Facebook spokesperson said, "Enforcing against QAnon on Facebook is not new: we consistently take action against accounts, Groups, and Pages tied to QAnon that break our rules. Just last week, we removed a large Group with QAnon affiliations for violating our content policies, and removed a network of accounts for violating our policies against coordinated inauthentic behavior. We have teams assessing our policies against QAnon and are currently exploring additional actions we can take."
QAnon is a far-right movement whose members support the unfounded belief that a secret coalition of powerful figures is targeting President Donald Trump. QAnon members are largely supporters of the president and have in the past circulated disproven theories about former President Barack Obama and former Secretary of State Hillary Clinton, including claims that the two practice Satanism and are involved in a global pedophilia ring.
Facebook has faced scrutiny over its laissez-faire approach to content moderation, and employees have raised concerns about it in the past. Just last week, Facebook also said it removed a QAnon group page that had more than 200,000 members after finding its members were "repeatedly posting content that violated our policies."
Facebook isn't the only social media company that has cracked down on QAnon content. Twitter said in July that it was zeroing in on "so-called QAnon activity" and reportedly removed 7,000 accounts associated with content pertaining to the movement. TikTok disabled two popular hashtags associated with QAnon in late July as well.
The FBI has also warned that conspiracy theories pose domestic terrorism threats. The bureau cited cases in which an individual's belief in conspiracy theories or hoaxes resulted, or may have resulted, in violence, pointing to the Tree of Life synagogue shooting and the QAnon conspiracy.