Facebook employees worried an algorithm change in the middle of Trump's presidency would push sensationalistic and divisive content, a new report says
- Facebook employees knew that an algorithm change in 2018 could elevate false and divisive content.
- The company graded posts to decide what to prioritize in users' News Feeds.
- Employees said it was "an increasing liability," and Zuckerberg wasn't always open to fixing it broadly.
Facebook employees were well aware that a change to the platform's algorithm could elevate political divisiveness and outrage, according to company documents viewed by the Wall Street Journal.
The internal memos show Facebook made the change because people were using the platform less, the paper reported. The social network wanted to promote the posts with the most engagement to boost "meaningful social interactions," or MSI, the metric it uses to measure how heavily people engage with posts.
Facebook developed a system that assigned each post a score, which influenced how widely the platform would promote it. A "like" counted for one point, while reactions - including the angry emojis that sprout up on stories about controversial topics - counted for five points, according to the Journal.
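As a rough illustration of how such a point system could work - a minimal sketch, assuming only the one-point "like" and five-point reaction weights reported by the Journal; the function and variable names are hypothetical:

```python
# Hypothetical sketch of an MSI-style engagement score.
# Only the weights (likes = 1 point, reactions = 5 points) come from
# the Journal's reporting; the names and structure are assumed.

LIKE_WEIGHT = 1
REACTION_WEIGHT = 5  # includes reactions such as the "angry" emoji

def msi_score(likes: int, reactions: int) -> int:
    """Return a post's engagement score under the reported weights."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# Under these weights, a post with 100 likes and 40 angry reactions
# (100 + 200 = 300) outranks one with 250 likes and no reactions (250).
print(msi_score(100, 40))  # 300
print(msi_score(250, 0))   # 250
```

The fivefold weighting is the key detail: a post that provokes reactions - outrage included - can outrank a post that merely collects likes.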
There was an upside: people saw their close connections' posts more frequently and considered them more meaningful and trustworthy. But the change had adverse side effects: it prioritized content that was violent, toxic, false, politically divisive, and all-around outrageous, according to the report.
The change hit news publishers especially hard, forcing them to reorient their business strategies around readers who were more prone to click on or interact with sensationalistic content than with other kinds of stories, such as pieces about self-care.
Facebook disputed the Journal's characterization of the ranking system.
"Is a ranking change the source of the world's divisions? No," a spokesperson told Insider. "Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed. It also shows that meaningful engagement with friends and family on our platform is better for people's well-being than the alternative."
The spokesperson declined to say if the system is still in effect.
The effect of Facebook's algorithm on news outlets and on the spread of so-called clickbait articles, which are written specifically to hook readers' attention, has been discussed for years in Silicon Valley and media circles. But the memos reviewed by the Journal show Facebook employees worried that the algorithm change amplified the angriest, most vocal voices online.
"Our approach has had unhealthy side effects on important slices of public content, such as politics and news," Facebook data scientists wrote in a document, per The Journal.
"This is an increasing liability," read a separate memo, according to the paper.
CEO Mark Zuckerberg wasn't always open to implementing proposed solutions broadly across the platform. He rejected one fix that could have reduced false information across all topics because it might have caused people to spend less time on the platform, per the Journal.
The role Facebook's algorithm plays in spreading false and polarizing content online has taken center stage in recent years. The company has rolled out changes as a result, especially after the 2020 presidential election, the pandemic, and the deadly insurrection at the US Capitol on January 6.
Facebook said in August that it would reduce the volume of political posts it puts in front of users after surveying people online, who felt "that there's too much political content in their News Feeds," as Axios reported.
The new process relies less on the part of Facebook's algorithm that predicts how likely someone is to share or comment on a given post based on their past engagement. The shift could affect news publishers that produce politics-centric content.
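For illustration, a minimal sketch of what ranking by predicted engagement looks like, and how relying on it less for political content might work - the names, weights, and downweighting factor here are all hypothetical; only the idea of scoring posts by predicted shares and comments comes from the reporting:

```python
# Hypothetical sketch of ranking by predicted engagement. The article
# says the feed scores posts by how likely a user is to share or
# comment; the names and the political downweight are assumed.

def rank_score(p_share: float, p_comment: float,
               is_political: bool, political_weight: float = 0.5) -> float:
    """Combine predicted engagement probabilities into a ranking score,
    relying less on engagement signals for political posts."""
    score = p_share + p_comment
    return score * political_weight if is_political else score

# A political post with high predicted engagement (0.5 + 0.25 = 0.75)
# is halved to 0.375, dropping it below a non-political post at 0.5.
print(rank_score(0.5, 0.25, is_political=True))    # 0.375
print(rank_score(0.25, 0.25, is_political=False))  # 0.5
```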