A top Facebook exec told a whistleblower her concerns about widespread state-sponsored disinformation meant she had 'job security'

Apr 14, 2021, 11:30 IST
Business Insider
In this April 11, 2018, file photo, Facebook CEO Mark Zuckerberg pauses while testifying before a House Energy and Commerce hearing on Capitol Hill in Washington. AP Photo/Andrew Harnik, File
  • Facebook let dictators generate fake support despite employees' warnings, The Guardian reported.
  • Whistleblower Sophie Zhang repeatedly raised concerns to integrity chief Guy Rosen and other execs.
  • But Rosen said the amount of disinformation on the platform meant "job security" for Zhang.

Facebook allowed authoritarian governments to use its platform to generate fake support for their regimes for months despite warnings from employees about the disinformation campaigns, an investigation from The Guardian revealed this week.

A loophole in Facebook's policies allowed government officials around the world to create unlimited amounts of fake "pages" which, unlike user profiles, don't have to correspond to an actual person - but could still like, comment on, react to, and share content, The Guardian reported.

That loophole let governments spin up armies of what looked like real users who could then artificially generate support for and amplify pro-government content, in what The Guardian called "the digital equivalent of bussing in a fake crowd for a speech."

Sophie Zhang, a former Facebook data scientist on the company's integrity team, repeatedly raised alarms about the loophole to Facebook executives, including vice president of integrity Guy Rosen, according to The Guardian.

BuzzFeed News previously reported on Zhang's "badge post" - a tradition where departing employees post an internal farewell message to coworkers.


But one of Zhang's biggest concerns was that Facebook wasn't paying enough attention to coordinated disinformation networks in authoritarian countries, such as Honduras and Azerbaijan, where elections are less free and more susceptible to state-sponsored disinformation campaigns, The Guardian's investigation revealed.

Facebook waited 344 days after employees sounded the alarm to take action in the Honduras case, and 426 days in Azerbaijan, and in some cases took no action, the investigation found.

But when Zhang raised Facebook's inaction in Honduras with Rosen, he dismissed her concerns.

"We have literally hundreds or thousands of types of abuse (job security on integrity eh!)," Rosen told Zhang in April 2019, according to The Guardian, adding: "That's why we should start from the end (top countries, top priority areas, things driving prevalence, etc) and try to somewhat work our way down."

Rosen told Zhang he agreed with Facebook's priority areas, which included the US, Western Europe, and "foreign adversaries such as Russia/Iran/etc," according to The Guardian.


Facebook spokesperson Liz Bourgeois disputed that Rosen dismissed Zhang's concerns, saying that "he asked her to share a summary of her findings so he could help."

However, it took Facebook another three months after that exchange to take down the Honduras network, which The Guardian reported happened in July 2019.

"We fundamentally disagree with Ms. Zhang's characterization of our priorities and efforts to root out abuse on our platform. We aggressively go after abuse around the world and have specialized teams focused on this work," Bourgeois told Insider in a statement.

"As a result, we've already taken down more than 100 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in countries around the world, including those in Latin America, the Middle East and North Africa, and in the Asia Pacific region. Combatting coordinated inauthentic behavior is our priority. We're also addressing the problems of spam and fake engagement. We investigate each issue before taking action or making public claims about them," she said.

However, Facebook didn't dispute any of Zhang's factual claims in The Guardian's investigation.


Facebook pledged to tackle election-related misinformation and disinformation after the Cambridge Analytica scandal and Russia's use of its platform to sow division among American voters ahead of the 2016 US presidential election.

"Since then, we've focused on improving our defenses and making it much harder for anyone to interfere in elections," CEO Mark Zuckerberg wrote in a 2018 op-ed for The Washington Post.

"Key to our efforts has been finding and removing fake accounts - the source of much of the abuse, including misinformation. Bad actors can use computers to generate these in bulk. But with advances in artificial intelligence, we now block millions of fake accounts every day as they are being created so they can't be used to spread spam, false news or inauthentic ads," Zuckerberg added.

But The Guardian's investigation showed Facebook is still delaying or refusing to take action against state-sponsored disinformation campaigns in dozens of countries, campaigns involving thousands of fake accounts that created hundreds of thousands of fake likes.

And even in supposedly high-priority areas, like the US, researchers have found Facebook has allowed key disinformation sources to expand their reach over the years.


A March report from Avaaz found "Facebook could have prevented 10.1 billion estimated views for top-performing pages that repeatedly shared misinformation" ahead of the 2020 US elections had it acted earlier to limit their reach.

"Failure to downgrade the reach of these pages and to limit their ability to advertise in the year before the election meant Facebook allowed them to almost triple their monthly interactions, from 97 million interactions in October 2019 to 277.9 million interactions in October 2020," Avaaz found.

Facebook admits that around 5% of its accounts are fake, a number that hasn't gone down since 2019, according to The New York Times. And MIT Technology Review's Karen Hao reported in March that Facebook still doesn't have a centralized team dedicated to ensuring its AI systems and algorithms reduce the spread of misinformation.
