
Facebook's confession about a bug that affected 14 million people signals a seismic shift inside the company

Jun 8, 2018, 15:19 IST

Facebook CEO Mark Zuckerberg. Getty

  • Facebook admitted on Thursday that a bug resulted in people's status updates defaulting to public, even if they had been set to private.
  • This is one of the few times Facebook has proactively disclosed a privacy issue that affects millions of people.
  • It signals a change of approach following the humbling Cambridge Analytica scandal.
  • Silicon Valley firms shining a light into their own black boxes should be encouraged.


Facebook is finally becoming more transparent about bugs and leaks that affect user data.

The company has proactively confessed to a bug that set people's status updates to public by default, even if they had previously been set to private. The flaw meant some 14 million people posted public updates without realising they were visible to the entire internet.

Obviously, this is not a good thing. Trust in Facebook is low thanks to years of privacy scandals and, most recently, the Cambridge Analytica saga. Even a comparatively small bug chips away at what trust remains.

But this is one of the only times Facebook has confessed to a privacy issue before journalists or researchers found it.


This year alone, journalists were responsible for uncovering the Cambridge Analytica scandal, for revealing that some third-party Facebook apps leaked user information, and for reporting that Facebook had extensive data-sharing partnerships with handset makers. While investigating the Cambridge Analytica scandal, Facebook did proactively confess that malicious actors could scrape user profile information, a flaw that affected most of its 2 billion users.

This is all part of a broader trend of news outlets highlighting how bad tech companies are at policing their own platforms, usually by finding and flagging bad content. The Times newspaper in the UK, for example, highlighted how brands were inadvertently funding terror content on YouTube. Business Insider discovered that YouTube Kids was suggesting conspiracy theory videos to its young viewers. Reporters even have a nickname for this: content moderation journalism.

Facebook has evidently decided to get out in front of the stories for once. In an apologetic post about the privacy bug, the company wrote: "We've heard loud and clear that we need to be more transparent about how we build our products and how those products use your data - including when things go wrong. And that is what we are doing here."

Facebook CEO Mark Zuckerberg arrives to testify before a Senate Judiciary and Commerce Committees joint hearing regarding the company's use and protection of user data, on Capitol Hill in Washington, U.S., April 10, 2018. Aaron P. Bernstein/Reuters

This is the year in which Mark Zuckerberg has appeared before the US Congress and members of the European Parliament to explain his platform's impact on democratic processes. Until now, lawmakers have largely come away with the impression that he doesn't grasp the immense power and responsibility Facebook wields.


And Facebook CTO Mike Schroepfer said the Cambridge Analytica scandal caused the biggest cultural shift inside the company in a decade. Clearly, Facebook has had a rethink about what it should disclose, and when.

Not everyone will be convinced by this change of heart, and it has clearly come about as a result of media, public, and political pressure. It's a positive step, though, and one that other technology firms should adopt. Google, for example, has tripped up on policing its own platform: The Times and Business Insider recently found that its autocomplete feature exposed the names of victims in high-profile sexual assault cases.

If Facebook begins to shine a torch into the black box of its algorithms and data, it could be the first step toward rebuilding public trust in Silicon Valley.
