Facebook has no idea how far COVID-19 vaccine misinformation has spread.
- Facebook data scientists reportedly asked for resources to measure COVID misinformation.
- But company executives did not approve the request, according to The New York Times.
- The report suggests that Facebook does not know how widespread vaccine misinformation is on its platform.
Facebook cannot precisely measure how much misinformation exists on its platform - or how much of it its users actually see.
And according to The New York Times, data scientists at the company asked for resources at the start of the pandemic so they could figure out how to do so. They said, per the outlet, that building such a tool could take at least a year.
But sources told the Times that company executives never gave them the green light for the extra resources.
The report suggests that Facebook passed on a chance to monitor falsehoods more closely before anti-vaccine misinformation flooded the platform. Fast forward a year, and the company is facing mounting pressure from the Biden administration and the public over what critics describe as its failure to stop misinformation from proliferating.
Facebook did not immediately respond to Insider's request for comment. The company told the Times that "the suggestion we haven't put resources toward combating Covid misinformation and supporting the vaccine rollout is just not supported by the facts."
Last week, US Surgeon General Vivek Murthy declared anti-vaccine misinformation on social media to be "a serious threat to public health." And White House Chief of Staff Ron Klain also said recently that Facebook is "a giant source" of vaccine misinformation.
"I've told Mark Zuckerberg directly that when we gather groups of people who are not vaccinated, and we ask them, 'Why aren't you vaccinated?' and they tell us things that are wrong, tell us things that are untrue, and we ask them where they've heard that. The most common answer is Facebook," Klain said.
Things reached a boiling point when President Joe Biden said last week that Facebook and internet platforms like it are "killing people" by allowing false information to exist. He later walked back those comments after Facebook published a blog calling for an end to "finger-pointing."
"The fact is that vaccine acceptance among Facebook users in the US has increased," the company said in the blog. "These and other facts tell a very different story to the one promoted by the administration in recent days."
Facebook has said that it has removed more than 18 million pieces of COVID misinformation "as well as accounts that repeatedly break these rules" and has "connected more than 2 billion people to reliable information about COVID-19 and COVID vaccines across our apps."
Still, there's no way to know how many people saw misinformation first - or whether they were later swayed by Facebook's reliable sources.