Facebook conducted a study to see if people's news feeds showed them opposing viewpoints - and the results are not encouraging

May 10, 2015, 01:19 IST

Facebook data scientists published a study in the journal Science this week saying that the social network doesn't completely isolate its users from different political viewpoints - but that even if it did, there might not be anything wrong with that.

The issue at hand is whether Facebook's filtering algorithm keeps people from reading news stories and opinions they disagree with, since it's designed to mostly show you things you like. This matters because exposure to different political ideas is important to democracy. After all, you can't make a smart decision without considering all sides.

But if the Facebook News Feed is automatically bumping stuff you might not agree with down the page, where this same study shows it's far less likely to be clicked on, there are entire worlds of perspective you could miss out on.
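Facebook hasn't published its News Feed ranking code, so the mechanism in question can only be sketched. The toy Python below ranks stories purely by a predicted click probability - every name and number in it is an illustrative assumption, not Facebook's actual system. The point: no explicit penalty on opposing viewpoints is needed, because a click-prediction model trained on a user's history will sink cross-cutting stories on its own.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    predicted_click_prob: float  # hypothetical model score: chance this user clicks
    cross_cutting: bool          # True if the story opposes the user's stated views

def rank_feed(stories: list[Story]) -> list[Story]:
    """Order stories by predicted engagement, highest first (illustrative only)."""
    return sorted(stories, key=lambda s: s.predicted_click_prob, reverse=True)

feed = rank_feed([
    Story("Op-ed you agree with", 0.30, cross_cutting=False),
    Story("Op-ed you disagree with", 0.12, cross_cutting=True),
    Story("Friend's vacation photos", 0.45, cross_cutting=False),
])
for position, story in enumerate(feed, start=1):
    print(position, story.title)  # the cross-cutting op-ed lands at the bottom
```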

The study finds that on average, around 23% of a user's friends will be of a different political affiliation, and an average of 29% of News Feed stories will present a user with an opposing viewpoint. 

Meanwhile, Facebook says that the filtering algorithm itself is more affected by which stories a user clicks on than by anything Facebook itself does.

In other words, it's bad, but not as bad as Facebook's worst critics fear. 

"This shows that the effects that I wrote about exist and are significant, but they're smaller than I would have guessed," Eytan Bakshy, the data scientist who led the study, told the New York Times. 

In the study, Facebook refers to displaying a story that a user won't necessarily agree with as "cross-cutting." 

The baffling part is that Facebook's study says "we do not pass judgment on the normative value of cross-cutting exposure," and that "exposure to cross-cutting viewpoints is associated with lower levels of political participation."

Microsoft Research scientist Christian Sandvig has a problem with this approach.  

"So the authors present reduced exposure to diverse news as a 'could be good, could be bad' but that's just not fair. It's just 'bad.' There is no gang of political scientists arguing against exposure to diverse news sources," Sandvig writes in a blog entry

There's also a major flaw in the study. Of Facebook's roughly 200 million American users, the research team opted to study just 10.1 million - about 5%. And those 10.1 million weren't just any users: the sample was drawn only from the roughly 9% of users who list a political affiliation on their profiles.

Crunch those numbers and Facebook's findings rest on a small, self-selected slice of its American user base.
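A quick back-of-the-envelope check, using the article's own numbers (the 200 million count is a round estimate):

```python
us_users = 200_000_000   # round estimate of American Facebook users
studied = 10_100_000     # users actually included in the study
self_reporting = 0.09    # approximate share of users who list a political affiliation

print(f"Share of US users studied: {studied / us_users:.1%}")    # -> 5.1%
print(f"Pool the sample came from: {self_reporting:.0%} of all users")
```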

The big problem with this is that it means the sampling wasn't random. Facebook chose from a base of people who feel strongly enough about politics to list it on their profile page, which isn't representative of Facebook's population at large. 
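This is a textbook selection-bias problem. A toy simulation - with click rates invented purely for illustration - shows how measuring only the self-identifying minority can badly misstate what the average user does:

```python
import random

random.seed(42)

# Invented assumption: users who list a political affiliation are more
# politically engaged, so they click cross-cutting stories at a higher rate.
N = 100_000
total_clicks, sample_clicks, sample_size = 0, 0, 0
for _ in range(N):
    self_identifies = random.random() < 0.09        # ~9% list an affiliation
    click_rate = 0.20 if self_identifies else 0.10  # illustrative rates only
    clicked = random.random() < click_rate
    total_clicks += clicked
    if self_identifies:
        sample_clicks += clicked
        sample_size += 1

print(f"Whole population's click rate:  {total_clicks / N:.1%}")             # ~10.9%
print(f"Self-identified sample's rate:  {sample_clicks / sample_size:.1%}")  # ~20%
```

A study built on the second number tells you little about the first.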

"People who self-identify their politics are almost certainly going to behave quite differently, on average, than people who do not, when it comes to the behavior in question which is sharing and clicking through ideologically challenging content," wrote University of North Carolina Assistant Professor Zeynep Tufekci in a Medium post on the study.

Even with all of these issues and strange philosophical bents, it's one of the first studies of its kind, and will prove invaluable to researchers going forward. 
