
Facebook is changing its 'race-blind' hate speech algorithm to prioritize flagging content it deems 'the worst of the worst,' after backlash over removing Black users' posts

Dec 4, 2020, 01:49 IST
Business Insider
Facebook CEO Mark Zuckerberg leaving The Merrion Hotel in Dublin after a meeting with politicians to discuss regulation of social media and harmful content in April 2019. (Niall Carson/PA Images via Getty Images)
  • Facebook is changing the way that it flags and removes hate speech, in order to prioritize the 'worst of the worst,' such as content attacking Black, LGBT, or Muslim people.
  • Facebook's proactive technology will no longer automatically flag posts declaring, "Men are trash," or "Americans are dumb," Facebook said.
  • Instead, the platform is placing more focus on automatically catching especially harmful content, like posts featuring Blackface, Holocaust denial, and stereotypes about Jews controlling the world.
  • "We know that hate speech targeted towards underrepresented groups can be the most harmful, which is why we have focused our technology on finding the hate speech that users and experts tell us is the most serious," a Facebook spokesperson told Business Insider.
  • The move is a break from the platform's previous 'race-blind' strategy, which some users said unfairly flagged posts made by people of color.

Facebook is changing the way that it polices hate speech on the platform, the Washington Post first reported Thursday.

Facebook will stop automatically flagging or removing a small subset of content attacking Americans, men, and white people, a spokesperson for the company confirmed to Business Insider. In addition, the platform is changing its hate speech algorithm to be more sensitive toward attacks on Black people, Muslims, Jews, LGBTQ people, and other minority groups, Facebook confirmed.

"We know that hate speech targeted towards underrepresented groups can be the most harmful, which is why we have focused our technology on finding the hate speech that users and experts tell us is the most serious," said Sally Aldous, Facebook spokesperson, in a statement.

"Over the past year, we've also updated our policies to catch more implicit hate speech, such as content depicting Blackface, stereotypes about Jewish people controlling the world, and banned holocaust denial," she added.

The move is a break from Facebook's previous "race-blind" hate speech policy, the Washington Post reported. Previously, Facebook did not distinguish between historically marginalized groups, like minorities, and other groups when it came to hate speech.


As a result of that content moderation policy, Black activists have said that their Facebook posts about racism were unfairly flagged or removed, and that the platform was slow to address actual racist attacks, according to USA Today. Some activists began avoiding the word "white" in their posts, opting for "wipipo" instead, to keep their posts from being taken down, the Washington Post reported.

Read more: Strict Facebook NDAs reveal how the company bars some of its ad agencies from speaking about it and even confirming public information

The project to change Facebook's hate speech enforcement is called "The Worst of the Worst," or "WoW" for short, according to the Washington Post. Facebook has been working on it since 2019, a Facebook spokesperson told Business Insider.

While the new policy prioritizes automatically flagging anti-Black content and de-emphasizes flagging certain anti-white content, Facebook says the changes largely affect its "proactive technology." The platform will still remove content if someone reports it, Facebook told Business Insider.

"Thanks to significant investments in our technology we proactively detect 95% of the content we remove and we continue to improve how we enforce our rules as hate speech evolves over time," Aldous said in a statement.
