
Why Facebook categorizes phrases like 'men are trash' as hate speech, according to Mark Zuckerberg

Mary Meisenzahl   


Facebook CEO Mark Zuckerberg speaks at Facebook's annual F8 developers conference in San Jose, California, April 30, 2019. REUTERS/Stephen Lam

  • The Verge published leaked audio from a July all-hands Facebook meeting, the kind of gathering that is usually kept internal to the company.
  • Reporter Casey Newton released another transcript in The Interface, his newsletter, on Wednesday.
  • In the new transcript, Facebook CEO Mark Zuckerberg addresses the company's hate speech policies, including the phrase "men are trash."
  • "So substitute in your mind while you're thinking through this, what if this were 'Muslims are trash,' right? You would not want that on the service," Zuckerberg said.
  • Visit Business Insider's homepage for more stories.

On Tuesday, The Verge published highlights from leaked audio of a July Facebook Q&A between CEO Mark Zuckerberg and employees. Verge reporter Casey Newton released another selection from the audio transcript in his newsletter, The Interface, on Wednesday, in which one employee asked about the specifics of the platform's hate speech policy.

"Question: According to your policies 'men are trash' is considered tier-one hate speech. So what that means is that our classifiers are able to automatically delete most of the posts or comments that have this phrase in it. Why?"

In response, Zuckerberg gave one of his most detailed explanations of Facebook's content moderation policy to date. He called the hate speech policies "fraught," then laid out the reasoning behind the categorization.

"So one is, gender is a protected category," Zuckerberg said. "So substitute in your mind while you're thinking through this, what if this were 'Muslims are trash,' right? You would not want that on the service."

He went on to say that the company needs extremely specific protocols to get consistent responses from the 30,000-plus content moderators around the world. Zuckerberg noted that some people might want a policy that distinguishes between historically oppressed groups and others, perhaps treating "men are trash" differently from "women are trash."

Essentially, Zuckerberg believes Facebook shouldn't be in the business of assessing whether a group has been oppressed, because Facebook has too much content to apply that kind of nuance or context when moderating individual comments.

Read more: Mark Zuckerberg reacted to the leaked transcript of his internal meeting by promoting it on his Facebook page

He called this policy "a principled approach for having a global framework that is actually enforceable around the world," and claimed that most user complaints come from moderators making mistakes and not applying the rules as they were intended, rather than issues with the rules themselves.

In the past, Facebook's hate speech policy has been unclear, and the company has faced criticism for leaving violent or offensive posts up. The lack of transparency about what Facebook would allow on the platform left users guessing at the actual policy. In 2017, ProPublica asked Facebook about 49 potential instances of hate speech, and the company admitted that moderators had made mistakes on 22 of them. The New York Times published a quiz asking readers to guess which phrases violated Facebook's policy.

In the past year, stories have emerged about Facebook content moderators being exposed to disturbing material, including violent deaths, extreme hate speech, and child pornography. The Verge has also reported on how tightly content moderators are controlled, down to their bathroom and prayer breaks. Some develop anxiety during training, and many continue to suffer from trauma even after leaving the job.

When an employee asked Zuckerberg about these reports of content moderators at the July meeting, Zuckerberg was dismissive.

"Some of the reports, I think, are a little overdramatic," Zuckerberg said. "From digging into them and understanding what's going on, it's not that most people are just looking at just terrible things all day long."

You can subscribe to Newton's daily newsletter here.
