Facebook expanded its rules on posting misinformation and will remove all false claims about COVID vaccines, including that they cause autism

Natasha Dailey   

  • Facebook added false vaccine claims to its updated protocols for combating COVID-19 misinformation.
  • Posts claiming vaccines are ineffective or unsafe will be removed, the social network said.
  • Facebook also said it is working with the World Health Organization to remove misinformation about all vaccines.

Facebook said it will remove misinformation about the COVID-19 vaccines from its platform and from Instagram to continue combating false claims about the pandemic.

On Monday, Facebook updated the list of misinformation it would remove from its social sites to include several false claims about the COVID-19 vaccines, including that they're "toxic, dangerous or cause autism." Since February 2020, the social media giant has been removing false claims about the virus.

Posts will be removed if they claim that the COVID-19 vaccines will kill or seriously harm people, cause autism or infertility, change people's DNA, or produce implausible side effects like turning a person into a monkey. Other false claims will also be removed, such as statements that contracting the disease is safer than getting the vaccine or that the shot is unsafe for certain groups of people. Facebook will also take down false statements about how the COVID-19 vaccines were made or about their efficacy.

Facebook will also remove misinformation about vaccines in general, such as claims that they cause infant death or are poisonous. The company said it consulted the World Health Organization and other leading health groups to determine the list of false claims.


Facebook has updated its misinformation guidelines for COVID-19 several times since the start of the pandemic last year. At first, the social network reduced the visibility of false claims about the virus by limiting their distribution and adding warning labels with more context. In April 2020 alone, the company put warning labels on 50 million pieces of content.

It has since begun removing false claims entirely. The company said it has taken down "more than 12 million pieces of content on Facebook and Instagram containing misinformation that could lead to imminent physical harm."

The company is also giving $120 million in advertising credits to health organizations working to reach billions of people with information about COVID-19 and the vaccines. It will also help users determine where and when they can receive a vaccine, similar to its tool that helped people determine when and where they could vote.

"In 2021 we're focused on supporting health leaders and public officials in their work to vaccinate billions of people against COVID-19," Facebook said in its statement.

In the past year, the social media platform has responded to calls to address the growing spread of misinformation. In October, it banned pages and groups associated with the QAnon conspiracy theory, and in January it placed an indefinite ban on former President Donald Trump, who had stoked false claims of a stolen presidential election that eventually led to the deadly Capitol riot on January 6.
