
Mark Zuckerberg wants an independent council of non-Facebookers to make the final call on removing dangerous content, 'almost like a Supreme Court'

Avery Hartmans   


Mark Zuckerberg. Justin Sullivan/Getty Images

  • Mark Zuckerberg wants to form an independent council that reviews Facebook posts that have been flagged as harmful or abusive.
  • The council would work "almost like a Supreme Court," and would allow users to make an appeal when their posts are removed from the site.
  • Facebook already has an internal team to moderate abuse, and has said it plans to double that team to 20,000 employees in 2018.

Mark Zuckerberg has a new idea for how to deal with abusive content on Facebook: a Supreme Court.

In an interview with Vox's Ezra Klein, Zuckerberg outlined his idea for how Facebook can better manage content that's been flagged as harmful or abusive. Zuckerberg said he believes Facebook's first step should be to build an appeals process that works much like the federal court system, with an independent body making the final call.

Here's how Zuckerberg described it (emphasis ours):

"But over the long term, what I'd really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don't work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world."

In other words, Zuckerberg wants an independent team to help decide what's acceptable speech and what isn't.

Facebook already has an internal Community Operations and Review Team, which includes content reviewers. Under the current system, any Facebook post that has been reported is reviewed by that team. If the team finds that a post violates Facebook's terms of service, the post is taken down, and the user who posted it has no way to appeal the decision.

Facebook said it plans on doubling its internal safety and security team to 20,000 people in 2018, according to ProPublica.

You can read more from Klein's interview with Zuckerberg over on Vox.
