
Facebook finally reveals the long-secret rules for what exactly will get you banned from the social network

Apr 24, 2018, 14:32 IST


  • Facebook is publishing the internal guidelines its moderators use to police the social network.
  • The rules have been shrouded in confusion and secrecy for years, though they have leaked before.
  • The new "Community Standards" are 8,500 words long and go into great detail about exactly what is and isn't allowed - from sexual and violent content to hate speech.
  • Facebook is also adding new procedures to file an appeal when its moderators remove a post.


Facebook is finally publishing the full internal guidelines its content moderators use to police the social network.

The move, announced Tuesday, offers significant new transparency around how the company manages its 2 billion users. Moreover, it comes paired with the announcement that Facebook will shake up the process for how it handles those cases where users report potentially objectionable content.

The news comes as the company struggles to move past a string of scandals that have bruised its public image, from Facebook's use in the spread of Russian propaganda, to the now-infamous Cambridge Analytica scandal, where a political research firm improperly obtained access to as many as 87 million users' profile data.

Facebook already publishes a set of "Community Standards," a relatively brief overview of its global rules, for public consumption. But for years, Facebook has maintained a second, far more detailed set of guidelines intended for its team of content moderators, one that previously wasn't available to Facebook's users.


These have leaked before - but now, for the first time, Facebook is publicly releasing an official version. The full guidelines are about 8,500 words, and include rules for what's acceptable and what's not in terms of violent, sexual, or otherwise controversial content, along with hate speech and threatening language.

This could give Facebook's billions of users their best look yet at what's acceptable, and what's not, on the global social network - potentially giving some context to what's been criticised as a sometimes mystifying and seemingly inconsistent process for deciding when to ban or otherwise discipline a user.

"We want people to know about these standards, we want to give them clarity," Monika Bickert, Facebook's head of global policy management, said on a conference call with reporters ahead of the announcement.

The new rules are 8,500 words long

The complete text is a sprawling affair: A draft shared with reporters stretches to more than 8,500 words, and goes into explicit detail about what is and isn't allowed, from credible threats of violence to sexual exploitation.

Under the category "Graphic Violence," for example, prohibited content includes "enjoyment of humiliation," "erotic response to suffering," and "remarks that speak positively of the violence." Meanwhile, images of "visible innards," "charred or burning people," or "victims of cannibalism" are allowed "in a medical setting" and must be restricted so only users 18 and older can see them.


The only difference between the guidelines being made public on Tuesday and those given to Facebook employees is "aesthetic," Bickert said. Facebook isn't sharing its training materials, but otherwise there is no supplementary material that isn't being published, she said.

Facebook also expects to update the Community Standards regularly, in response to feedback and as required.

Facebook's actual rules aren't changing - but its reporting system is

Like other social networks, Facebook has grappled with controversy over its content moderation policies.

The social network has been criticised over the spread of hate speech, its approach to users' "real" names, and for censoring famous artwork that includes nudity. It once attempted to block an iconic photo from the Vietnam War, "The Terror of War," on the grounds it shows a naked child. It even censored Norway's Prime Minister and one of its biggest newspapers before backing down.

It's important to note that the release of these guidelines doesn't actually signal a change in Facebook policy.


For example, Facebook's policy of banning images containing female nipples (with some exceptions, like breastfeeding), which has been criticised by the #FreeTheNipple movement, remains. (Bickert defended it as being about "safety," since it can be difficult to tell whether the photo's subject consented to having it shared.)

The company is updating how it handles user reports, however, allowing users to appeal against content moderators' decisions about individual posts. The social network is starting by allowing people to request a "review" if one of their posts has been taken down. Later in the year, users who report a post, and are told that the post in question doesn't break the rules, will be able to appeal that decision, too.

Facebook said the decision to release its full guidelines publicly pre-dates the Cambridge Analytica scandal. It has been in progress "for a long time," said Mary deBree, Facebook's head of content policy, and the team "started drafting in earnest" in September 2017.

