Facebook has vowed to 'do more' to stop spreading misinformation

Julie Bort   


Facebook CEO Mark Zuckerberg. Justin Sullivan/Getty Images

Some people believe Facebook played a dangerous role in the presidential election by feeding too many people fake news that validated their choice of candidate while vilifying the opponent.

Facebook's News Feed algorithm is tuned to try to show you things you like; it doesn't necessarily distinguish between fact and fiction. Just last week, BuzzFeed documented how Trump supporters were being duped by an onslaught of fake news generated by more than 100 pro-Trump websites run from a single town in Eastern Europe.

This effect has been called the "filter bubble": after a while, all you see is content you are likely to agree with, and you stop seeing news and other items that might alter, or even just inform, your worldview.
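
To see why ranking purely by what you are likely to like can produce a filter bubble, consider the toy sketch below. It is an illustration only, not Facebook's actual system: the Post class, engagement numbers, and topic-affinity scores are all invented for the example. The score rewards topics the reader already engages with and never checks accuracy, so an engaging fabrication outranks a fact-check.

    # Toy illustration of engagement-based feed ranking (not Facebook's code).
    from dataclasses import dataclass

    @dataclass
    class Post:
        headline: str
        topic: str          # hypothetical topic label
        engagement: float   # hypothetical engagement signal (clicks, shares)

    def predicted_liking(post, affinity):
        # Score by how much this reader has engaged with the topic before.
        # Note: nothing here checks whether the post is accurate.
        return post.engagement * affinity.get(post.topic, 0.1)

    def rank_feed(posts, affinity):
        return sorted(posts, key=lambda p: predicted_liking(p, affinity), reverse=True)

    posts = [
        Post("Fabricated but flattering story about your candidate", "my-candidate", 0.9),
        Post("Fact-check: that story is false", "fact-check", 0.4),
    ]
    # A reader who mostly engages with content favoring their candidate...
    affinity = {"my-candidate": 1.0, "fact-check": 0.2}

    for post in rank_feed(posts, affinity):
        print(post.headline)   # the fabricated-but-engaging story ranks first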

Facebook has resisted the idea that it has a responsibility to present balanced news in its News Feed the way a news organization or media company would.

But facing criticism over the "filter bubble" and the part it may have played in the election, Adam Mosseri, Facebook's VP of product management, repeated to TechCrunch's Natasha Lomas a previous statement that Facebook is trying to do a better job of filtering out fake news, acknowledging that "there's so much more we need to do."

Mosseri first gave the statement before the election, but it seems, if anything, even more relevant now.

Facebook clearly doesn't want to become known as a platform for propaganda.

Here's the full statement from Mosseri:

We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue and improve the experiences on our platform.

Interestingly, Facebook's admission that it could do more to inform people, instead of misinform them, came as CEO Mark Zuckerberg posted a wistful note about the election, mentioning his baby daughter, Max.

This work is bigger than any presidency and progress does not move in a straight line. The most important opportunities of Max's generation -- like curing all disease, improving education, connecting everyone and promoting equal opportunity -- will take long term focus and finding new ways for all of us to work together, sometimes over decades. We are all blessed to have the ability to make the world better, and we have the responsibility to do it. Let's go work even harder.
