84% of medical misinformation on Facebook is never tagged with a warning, and is viewed billions of times, report says
- 84% of medical misinformation posts on Facebook were left up with no warning label, despite the platform's policy to counter bogus claims, an investigation found.
- Non-profit organisation Avaaz said 16% of the posts it examined that had been fact-checked were labelled, while the other 84% were not.
- It said bogus claims were viewed some 3.8 billion times, with an especially large audience as the coronavirus pandemic first began to spread.
- Facebook said Avaaz's report did "not reflect the steps we've taken" to counter misinformation, and argued that it was working to circulate "credible health information."
A large majority of Facebook posts containing medical misinformation — 84% — were left online with no labels or warnings, according to a report by the group Avaaz.
It said that its survey of bogus medical claims and advice on the platform found that only 16% of posts were given a label highlighting their contents as untrue, unproven, or harmful. The other 84% were not.
The report by Avaaz, a US non-profit, looked at 174 pieces of content that were fact-checked by a credible third party and found to contain health misinformation.
Avaaz found that many posts, some of which reached millions of people, evaded Facebook's labelling because the content had been reposted from other pages or translated into other languages.
Avaaz found that over the past year health misinformation was viewed 3.8 billion times on Facebook across at least five countries — the US, the UK, France, Germany, and Italy. It said views peaked in April, as the coronavirus pandemic spread rapidly around the world.
The group concluded that Facebook poses a "major threat" to public health.
It found that content from the 10 biggest websites spreading health misinformation drew almost four times as many Facebook views as content from 10 leading health bodies, such as the World Health Organisation (WHO) and the US Centers for Disease Control and Prevention (CDC).
The group said Facebook should place independent, fact-checked corrections alongside misinformation on the platform, estimating that this could cut people's belief in false claims by an average of almost 50%.
It also said Facebook should alter its algorithm to reduce the reach of misinformation by 80%.
Avaaz said: "Facebook has yet to effectively apply these solutions at the scale and sophistication needed to defeat this infodemic, despite repeated calls from doctors and health experts to do so."
The coronavirus pandemic has put Facebook under a new spotlight, as false and misleading information about the virus, its source, vaccines, cures, and the role of major figures spreads across social media platforms.
Facebook has taken down conspiracy videos, warned people who may have spread misinformation, and removed anti-lockdown event pages.
But Avaaz said the prevalence of misinformation despite these measures shows that "even the most ambitious among Facebook's strategies are falling short of what is needed to effectively protect society."
It called its investigation "one of the first to measure the extent to which Facebook's efforts to combat vaccine and health misinformation on its platform have been successful, both before and during its biggest test yet: the coronavirus pandemic."
Facebook rebuffed Avaaz's findings in a statement to the BBC, saying they did "not reflect the steps we've taken":
"We share Avaaz's goal of limiting misinformation. Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of Covid-19 misinformation and removed seven million pieces of content that could lead to imminent harm.
"We've directed over two billion people to resources from health authorities and when someone tries to share a link about COVID-19, we show them a pop-up to connect them with credible health information."