Facebook bans Holocaust-denial content after allowing it for years
- Facebook announced Monday it was changing its hate speech policy to "prohibit any content that denies or distorts the Holocaust."
- The company has faced criticism for more than a decade over its refusal to moderate anti-Semitic content that distorts or denies the Holocaust, in which Nazis and their allies systematically killed 6 million Jews.
- In the weeks leading up to the 2020 presidential election, Facebook has tried to blunt criticism that it fails to prevent the spread of dangerous conspiracy theories and disinformation. Just last week, the company said it was banning QAnon accounts across its platforms.
Facebook has banned Holocaust-denial content from the platform after years of criticism over its refusal to take action against such anti-Semitic rhetoric.
Facebook announced Monday it was updating its hate speech policy to "prohibit any content that denies or distorts the Holocaust."
The policy change marks an abrupt about-face from Facebook's refusal, for more than a decade, to remove content that denies the existence of the Holocaust, the genocide of millions of Jews and members of other minority groups. The platform has faced pressure from human rights and civil rights groups to take a stricter stance against such content, but Facebook had maintained that the "mere statement" of Holocaust denial did not violate its policies.
"I'm Jewish, and there's a set of people who deny that the Holocaust happened. I find that deeply offensive," CEO Mark Zuckerberg told Recode in July 2018. "But at the end of the day, I don't believe that our platform should take that down because I think there are things that different people get wrong."
In the meantime, Holocaust-denial content on Facebook has not only persisted but flourished. A study published in August by the British think tank the Institute for Strategic Dialogue found that Facebook "actively promotes" Holocaust-denial content to users who have previously interacted with such material on the platform.
Critics say Facebook has failed to moderate its platform and prevent the spread of dangerous misinformation and conspiracy theories. A report from The Wall Street Journal earlier this year revealed that Facebook executives knew its platforms' algorithms encouraged divisiveness and chose not to act on that knowledge.
But in the weeks ahead of the 2020 presidential election, it appears Facebook is trying to show it's cracking down. The company announced last week it was banning accounts related to the conspiracy theory QAnon, which has become a rallying point for some on the far right and led to violence.