
Facebook is scrapping its system of flagging fake news because it had 'the opposite effect to what we intended'

Shona Ghosh   


[Photo: An anti-Trump protester references the president's love of the phrase "fake news". Scott Olson/Getty]

  • Facebook has been trying to combat the spread of misinformation on its platform through measures like putting a "Disputed" flag next to fake news.
  • The company is ditching this particular measure after finding it can actually entrench people more deeply in wrongly held beliefs.
  • The social media firm will stick to pointing people who read or share fake news to fact-checked, contextual articles.


Stopping people from reading fake news is proving tougher than expected, Facebook has admitted.

The company is ditching one of its original measures to tackle the spread of fake news on Facebook, which was to stick the word "disputed" next to misleading information.

But the tactic didn't account for human pigheadedness: tell someone their opinion is wrong, and they often dig into that opinion even further.

Facebook wrote in two explanatory blog posts that it would drop the "disputed" tag and provide contextual information next to fake news through its existing "Related articles" feature.

According to one of the posts, emphasis ours:

"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs - the opposite effect to what we intended. Related Articles, by contrast, are simply designed to give more context, which our research has shown is a more effective way to help people get to the facts. Indeed, we've found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown."

In a separate Medium post, the three Facebook staffers leading the firm's efforts against fake news wrote that giving people more context meant they shared less fake news. Unfortunately, neither the disputed tags nor the additional context stopped people from actually clicking on fake news.

"During these tests, we learned that although click-through rates on the hoax article don't meaningfully change between the two treatments, we did find that the Related Articles treatment led to fewer shares of the hoax article than the disputed flag treatment."

The trio cited academic research showing that giving context to fake news helped "reduce misperceptions."

Facebook will keep the other tactics it was trialling: using fact-checkers to determine the accuracy of articles, reducing the distribution of fake news, and sending alerts with extra context to people who have shared disputed stories.
