
Facebook had a very unsuccessful week in its fight against misinformation and hate speech

Aug 15, 2020, 19:53 IST
Business Insider
Facebook's executives have repeatedly pledged to do better about cleaning up the platform, but the company has a long way to go.Drew Angerer/Getty Images; The Asahi Shimbun/Getty Images
  • Facebook had a bad week when it came to cleaning up toxic content on its platform.
  • The Wall Street Journal reported that Facebook refused to take action against a politician who violated its hate speech policies for fear of backlash, just months after Trump's controversial posts suggesting violence against protesters.
  • Another report from NBC News found that Facebook has discovered a community of millions of QAnon conspiracy theorists on its platform, and, according to a report from the Tech Transparency Project, the company failed to deliver on its pledge to crack down on violent boogaloo extremist groups.
  • The company also got slammed for a loophole in its fact-checking policies that allows climate change skeptics to spread falsehoods by labeling them as "opinion," The Verge reported.
  • Amid a scathing civil rights audit, grilling from lawmakers, and a major advertiser boycott, Facebook has promised to "get better" at tackling hate speech and misinformation, but this week's missteps show that the company still has a long way to go.

In early July, Facebook executives including CEO Mark Zuckerberg, COO Sheryl Sandberg, and chief product officer Chris Cox geared up for a meeting with civil rights leaders who were fed up with what they called the company's failure to curb hate speech and misinformation on its platform.

The groups had organized an unprecedented advertiser boycott over the issue and did not mince words in their criticisms of the social media giant.

"We have been continually disappointed and stunned by Mark Zuckerberg's commitment to protecting white supremacy, voter suppression and outright lies on Facebook," Color of Change president Rashad Robinson said in a press release at the time.

The call to boycott Facebook came just weeks after a series of controversial posts by President Donald Trump in which he suggested violence against anti-racism protesters and spread false claims about mail-in voting. Facebook said neither post violated its policies.

Despite the blowback, the company insisted its policies were fine; it just needed to step up enforcement.


"We have clear policies against hate — and we strive constantly to get better and faster at enforcing them," Sandberg wrote in a Facebook post ahead of the meeting with civil rights groups. "We have made real progress over the years, but this work is never finished and we know what a big responsibility Facebook has to get better at finding and removing hateful content."

This was far from the first time the company had pledged to "get better," and civil rights groups emerged from the meeting unconvinced this time would be any different, saying Facebook "is not yet ready to address the vitriolic hate on their platform."

A day later, Facebook released its first-ever civil rights audit, which slammed the company over its refusal to moderate political speech. Sandberg offered a lukewarm commitment to implement some, but not all, of the auditors' proposed changes.

Facebook gave critics some brief glimmers of hope in the following weeks. It announced tweaks to how it labels posts from politicians that violate its hate speech policies, added a label to a Trump post making false claims about mail-in voting, took down his post containing COVID-19 misinformation, and shut down accounts associated with violent hate groups and conspiracy theorists.

But this week's news shattered any illusion that Facebook had made meaningful progress toward its purported goal of cleaning up the platform.


Facebook did not respond to a request for comment on this story.

Facebook (again) let politicians break its rules for fear of political backlash

The Wall Street Journal reported Friday that Facebook refused to apply its hate speech policies to T. Raja Singh, a politician from India's ruling party, despite his calls to shoot Muslim immigrants and threats to destroy mosques.

Facebook employees had concluded that, in addition to violating the company's policies, Singh's rhetoric in the real world was dangerous enough to merit kicking him off the platform entirely, according to the report. But the company's top public policy executive in India overruled them, arguing that the political repercussions could hurt Facebook's business interests in the country, its largest market globally.

Facebook has faced similar criticism in the US, where employees have complained that Facebook allows Trump and other conservatives to consistently bend its rules and doesn't take action because it fears political backlash.

Conspiracy theory groups are still flourishing on Facebook

On Monday, NBC News got a sneak peek at an internal Facebook investigation showing that thousands of groups and pages affiliated with the QAnon conspiracy theory have spread across its platform and attracted millions of followers.


NBC News reported that Facebook has been crucial to QAnon's growth because of its emphasis on groups, which its algorithm recommends to users based on their previous interests. Facebook executives even knew that the algorithm was pushing people to more radical positions, yet they shut down efforts to fix it, according to The Wall Street Journal.

Just months earlier, Facebook boasted that it had removed 11 QAnon accounts for using fake profiles to amplify their reach. But the investigation reported by NBC News reveals that Facebook has only chipped away at a tiny fraction of the conspiracies running rampant on its platform.

Violent extremists are evading Facebook's crackdowns

An analysis Wednesday from nonprofit group Tech Transparency Project said that "boogaloos" — violent anti-government extremists who advocate for a second Civil War and often espouse white supremacist views — were escaping Facebook's efforts to force them off the platform.

In June, Facebook said it banned hundreds of boogaloo-affiliated accounts, groups, and pages, and designated the movement a "dangerous organization." But TTP's review found that Facebook's "slow and ineffective response has allowed the movement to persist on its platform."

More than 100 new groups have popped up since Facebook's announcement, and others simply changed their names to avoid the crackdown, according to TTP, in a sign that boogaloos' tactics are evolving faster than Facebook can snuff out offenders.


Facebook fact-checking loophole lets climate change skeptics pass off falsehoods as opinion

Critics have long accused Facebook's third-party fact-checking program of lacking real teeth or enough resources to effectively fight back against misinformation.

One prominent example has been its policy exempting "opinion" pieces from fact-checks, which drew scrutiny last fall when Facebook overruled one of its fact-checkers. The fact-checker had determined that a post expressing doubt about climate change had cherry-picked data and labeled it "false," but after some Republicans alleged bias, Facebook removed the label, saying it was actually an opinion article.

Democratic lawmakers called on the company to close the loophole, but on Thursday, The Verge reported that Facebook is refusing to budge. Spokesperson Andy Stone told The New York Times last month that the company has bigger priorities, like coronavirus misinformation.

Unfortunately, it's not even clear Facebook can tackle that. Just two weeks after Stone's comments to The Times, Facebook took down a conspiracy theory video about the pandemic — but not before more than 14 million people had viewed it.
