
Facebook could solve the controversy over 'censoring' conservative news with one of newspapers' oldest features

May 10, 2016, 15:56 IST

Facebook founder and CEO Mark Zuckerberg speaks during the 2013 TechCrunch Disrupt conference on September 11, 2013 in San Francisco, California. Justin Sullivan/Getty Images

Facebook has found itself in the middle of a firestorm of criticism over allegations it is suppressing conservative news.

Reports from Gizmodo alleged that the curators who control the "Trending" bar on the side of the social network routinely chose not to feature conservative topics and news outlets, while favouring others such as the activist movement Black Lives Matter.

In a Facebook post published early Tuesday morning, the social network's VP of search, Tom Stocky, finally denied the reports, writing that "we take these reports extremely seriously, and have found no evidence that the anonymous allegations are true." (Scroll down for the full statement.)

Whatever the merit of the suppression claims, the incident points to a broader truth - that Facebook is becoming more and more like a traditional media company, and needs to embrace the thorny issues that come with the territory.

There may not be guidelines telling curators not to select certain topics, but people have implicit biases. That's natural, and to an extent unsurprising. Curators will inevitably pick topics they know about, and frame them in ways that reflect their own mindset. The Gizmodo article discusses exactly this (emphasis ours):


"Depending on who was on shift, things would be blacklisted or trending," said the former curator. This individual asked to remain anonymous, citing fear of retribution from the company. The former curator is politically conservative, one of a very small handful of curators with such views on the trending team. "I'd come on shift and I'd discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn't be trending because either the curator didn't recognize the news topic or it was like they had a bias against Ted Cruz."

This is precisely what happens in any media organisation: Writers and editors gravitate towards subjects of interest - and the combined weight of these biases gives the organisations their (deliberate or otherwise) slant, from the left-leaning Guardian to the populist Daily Mail.

The big difference is that Facebook doesn't frame itself as a media company - and generally tries to avoid the necessary debates about bias and slant that media companies have to tackle. Facebook's Trending section is presented as a neutral reflection of organically popular stories on the site: I'd wager that many Facebook users don't even realise there are human curators.

People warm themselves as they demonstrate since the Nov. 10 shooting of 24-year-old Jamar Clark, in front of the Minneapolis Police 4th Precinct on Tuesday, Nov. 24, 2015. Jeff Wheeler/Star Tribune via AP

But as Mathew Ingram writes over at Fortune, these problems are now catching up with the social network. "Facebook routinely says that it doesn't see itself as a media entity, and doesn't see its algorithmic choices as being of any concern to anyone outside the company - even when those choices help influence the way people think and behave, like whether they decide to vote and how they see political issues ... At some point, however, Facebook is going to have to grapple with these kinds of issues, or at least acknowledge that they exist and that people have a right to be concerned about them."

One potential remedy is to do what newspapers and media organisations have done for decades - use bylines and mastheads.


The vast, vast majority of the media business credits authors and producers on stories, as well as the organisation's editorial leadership. This allows for public accountability. Sure, the writer might be biased - and many writers embrace their biases, arguing that forced neutrality is itself a kind of falseness - but the byline means there is editorial accountability.

If Facebook adopted an optional masthead that showed who had curated topics, and which editors were online, it wouldn't solve the problem of unconscious biases. But it would alert users to those biases, leaving them better informed. It would be an acknowledgement that yes, the social network is grappling with these issues, and it's owning up to that.

However, doing so would require Facebook to embrace its role and responsibility as a dominant media organisation - something it has previously shown no inclination to do. "The thing that's important to remember is we can't start an editorialized feed," Facebook News Feed director Adam Mosseri told Time last year. "That's an editorial point of view the way a newspaper has, so we can't do that."

And even if it did, it might only be a temporary fix: Almost all of the former Facebook curators Gizmodo talked to told the tech site that they believed "they were there not to work, but to serve as training modules for Facebook's algorithm."

In short: It's only a matter of time until the curators are replaced entirely by opaque algorithms - and we will once again be none the wiser as to how the most powerful media company in the world decides what to show us.


Here's Facebook VP of Search Tom Stocky's full statement:
