Democrats attack Facebook, YouTube, and Twitter over the Capitol riots, saying the tech giants play 'whack-a-mole' with radicalization
- House Democrats wrote to Facebook, Google, and Twitter accusing them of fostering extremism.
- They compared the platforms' approach to moderating hateful and dangerous content to "whack-a-mole."
- Lawmakers recommended each platform make sweeping design changes to stop chasing engagement, which they said fosters extremism.
On the first full day of President Joe Biden's administration, House Democrats turned their attention to social media.
Rep. Anna Eshoo and Rep. Tom Malinowski on Thursday wrote letters to the CEOs of Facebook, Google, and Twitter accusing them of radicalizing people, resulting in the kind of fanaticism that led to a mob storming the US Capitol.
"On Wednesday, January 6th the United States Capitol was attacked by a violent, insurrectionist mob radicalized in part in a digital echo chamber that your company designed, built, and maintained," they wrote.
Although Eshoo and Malinowski are the lead authors, each letter was co-signed by 38 other members of Congress.
The lawmakers said the companies are not doing enough to root out extremism on their platforms, and that relying on moderation to take dangerous content down is a "whack-a-mole answer to a systemic problem."
The lawmakers argue significant design changes need to be made at Facebook, YouTube, which is owned by Google, and Twitter to stop the platforms from maximizing user engagement.
Although all the companies were told their platforms were in part responsible, Facebook appeared to be where the lawmakers laid the most blame.
"Perhaps no single entity is more responsible for the spread of dangerous conspiracy theories at scale or for inflaming anti-government grievance than the one that you started and that you oversee today as Chairman and Chief Executive Officer," they wrote to Mark Zuckerberg.
Their recommendations for Facebook revolved mainly around temporary measures the platform deployed during the election and has since scrapped.
They pointed to an experiment the platform ran ahead of the election in an effort to fight misinformation, in which it tweaked its News Feed algorithm to prioritize established news sites and demote fringe, partisan sites - resulting in what it called a "nicer feed."
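Facebook hasn't published the mechanics of that experiment, but the idea can be read as a re-ranking step that blends predicted engagement with a publisher-quality signal. A minimal sketch, with illustrative names and weights rather than Facebook's actual code:

```python
# A hypothetical feed re-ranker: blend predicted engagement with a
# publisher-quality signal so established outlets rise and fringe,
# partisan sites are demoted. All names and weights are illustrative.
from dataclasses import dataclass

@dataclass
class Post:
    url: str
    engagement_score: float   # predicted clicks/likes/shares, 0..1
    publisher_quality: float  # editorial-quality signal, 0..1

def rank_feed(posts, quality_weight=0.5):
    # quality_weight=0 is pure engagement ranking; raising it toward 1
    # approximates the demotion of low-quality publishers.
    def score(p):
        return ((1 - quality_weight) * p.engagement_score
                + quality_weight * p.publisher_quality)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("established-news.example/story", 0.55, 0.9),
    Post("fringe-partisan.example/claim", 0.80, 0.1),
]
print([p.url for p in rank_feed(feed)])  # established outlet ranks first
```

On this toy input, the fringe post wins on raw engagement but loses once publisher quality is weighted in, which is the trade-off the lawmakers want made permanent.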
The lawmakers' letter also highlighted that Facebook stopped automatic recommendations for political and social-issue groups ahead of the presidential election.
In a press conference last week, Facebook's chief operating officer, Sheryl Sandberg, said the Capitol mob had mostly organized on other sites.
"Our enforcement is never perfect, so I'm sure there were still things on Facebook. I think these events were largely organized on platforms that don't have our abilities to stop hate, and don't have our standards, and don't have our transparency," she said.
YouTube
In their letter to Google CEO Sundar Pichai and YouTube CEO Susan Wojcicki, the representatives asked the executives to permanently disable YouTube's autoplay feature, which automatically plays a recommended video as soon as the current one ends.
They also asked YouTube to stop recommending videos containing conspiracy theories altogether. "If those are too difficult to identify using automated processes, the company should cease all recommendations until an effective, technical solution is developed," they added.
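The letter doesn't say how such a halt would be implemented. One way to read its fallback is as a confidence gate on the recommender: withhold any video an automated classifier cannot confidently clear. The interface and threshold below are assumptions for illustration, not YouTube's systems, which aren't public.

```python
# Hypothetical confidence gate on a recommender: only surface videos an
# automated classifier confidently labels as clear of borderline
# conspiracy content; withhold everything else pending a better system.
def safe_to_recommend(video, classifier, confidence_floor=0.95):
    label, confidence = classifier(video)  # e.g. ("clear", 0.97)
    if confidence < confidence_floor:
        return False  # classifier can't tell reliably: don't recommend
    return label == "clear"

def build_recommendations(candidates, classifier):
    return [v for v in candidates if safe_to_recommend(v, classifier)]
```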
YouTube announced in 2019 it was changing its recommendation algorithm to crack down on "borderline content" featuring conspiracy theories.
Independent researchers studying the recommendation algorithm in the run-up to the 2020 election found it had effectively reduced recommendations of fringe misinformation videos compared with 2016. It also appeared to have the unexpected side effect of boosting Fox News videos.
Twitter
In their letter to Twitter, the representatives didn't make specific recommendations, although, as with Facebook, they highlighted temporary measures Twitter implemented around the election that curbed misinformation.
"Experts have rightly suggested that the platform needs to make permanent, fundamental design changes to limit the spread of harmful content, such as halting recommendations, limiting shares, and adding a circuit breaker-like function to slow the spread of the most viral and potentially dangerous content," the letter says, citing a Washington Post op-ed from earlier this month.
"It is our hope that Twitter will immediately make permanent changes to limit the spread of misinformation and other forms of harmful content, and that the company will begin a fundamental reexamination of maximizing user engagement as the basis for algorithmic sorting and recommendation," they added.
A Twitter spokesperson told Insider the company planned to respond to the letter. Facebook and Google did not immediately respond to Insider's requests for comment.