
Foreign actors and extremist groups are using disinformation on Twitter and other social networks to further inflame the protests across America, experts say

Jeff Elder   

  • Experts say Americans are being manipulated on social media by fake posts about unrest across the nation.
  • A former chief information officer of the White House says groups use fake photos and videos to manipulate Americans into emotional responses.
  • Researchers say social media activity following the death of George Floyd resembles a campaign by a Russian state-backed group that manipulated online discourse around the Black Lives Matter movement.
  • Twitter may be more susceptible to misleading posts than its peers, experts say. The social network has recently started labeling misleading posts.
  • Twitter says "When we identify information operation campaigns that we can reliably attribute to state-backed activity — either domestic or foreign-led — we will disclose them to the public."

Disinformation seeding discord on social media, some of it spread by foreign actors, is adding to a confusing whirlwind of news coverage of unrest across the nation, researchers and experts say.

Media and citizen journalists are posting video, images, and accounts of scattered and chaotic protest events in response to the killing of George Floyd by a Minneapolis police officer, and the posts are being reshared broadly. The result is an often overwhelming stream of media from multiple sites and sources, and experts say audiences must be aware that the situation is being manipulated.

"People need to be aware that these events on the ground are being spun for political reasons," says Angie Drobnic Holan, editor-in-chief of PolitiFact, the Pulitzer Prize-winning fact-checking news service of the Poynter Institute journalism think tank.

Much of that spin likely comes from forces outside of America, the experts warn. "Were there foreign-backed disinformation accounts targeting Americans this weekend? Absolutely. I am positive that was happening," says Molly McKew, a writer and lecturer on Russian influence who advises the non-profit political group Stand Up America.

In fact, University of Washington researchers who found Russian influence in the discourse around the Black Lives Matter civil rights movement in 2016 tell Business Insider that they are seeing the same signs in posts from over the weekend.

"There are definitely a number of information operations at play here constructing fake personas and parties that are presenting themselves as activists that may well be state-sponsored agencies," says researcher Ahmer Arif, a PhD candidate at the University of Washington.

"I'm really worried about infiltration of [online] activist movements by domestic provocateurs and foreign agents — who will try to shape these movements towards their own objectives," another member of the research team, University of Washington professor and researcher Kate Starbird, tweeted Saturday.

Disinformation is sometimes dismissed as just the work of automated "bots" or troll accounts mocking people, but the impact can be far greater, experts say.

"It is something people should take seriously," says Theresa Payton, former White House chief information officer under President George W. Bush and author of a new book about online disinformation. "Groups that conduct disinformation and manipulation either want you to react with an emotion or don't want you to take an action."

As an example, Payton cited a photo circulated over the weekend showing a large explosion next to the Washington Monument. It was tweeted as a photo from the weekend's protests, in which fires did burn in different parts of Washington, DC, and other cities. But this dramatic photo was in fact from the fictional television show "Designated Survivor."

"People might see that photo and think, 'That's awful. I don't support that,' And not be an ally to peaceful civil rights movements. That would be a real shame," Payton says.

'Hashtags can really be gamed'

Experts say Twitter may be especially vulnerable amid the scattered, nationwide unrest surrounding the death of George Floyd because users often follow multiple accounts for news on Twitter, and posts come at a rapid pace. Users may also be more likely to follow accounts from people they do not know, the experts say. Twitter hashtags can also spread misinformation quickly.

"Hashtags can really be gamed," says independent researcher Darius Kazemi, who has studied the influence of bots, or automated accounts, that break Twitter's rules to spread misinformation. "Someone trying to sew chaos and create noise on both sides can get a hashtag going and pull real, well-intentioned people into the spread of misleading videos and images, or bogus news accounts." Once human users share misinformation, bots can amplify it a great deal, he says.

On Monday, for example, accounts on both sides of the political aisle were spreading the hashtag #WHEREARETHEPROTESTERS, Kazemi said, advancing different narratives. Multiple accounts were tweeting identical messages, he said, suggesting those users were either hacked or bots.

A Twitter spokesperson said: "Investigations are always ongoing. When we identify information operation campaigns that we can reliably attribute to state-backed activity — either domestic or foreign-led — we will disclose them to the public."

A Russian troll once even tricked Twitter's CEO

Foreign actors have successfully influenced social media at high levels in previous civil rights struggles. Twitter chief executive Jack Dorsey retweeted a Russian troll posing as a civil rights activist 17 times from 2016 to 2017, Wall Street Journal researchers found. In 2017, Twitter released a list of 2,752 troll accounts affiliated with a Russian government agency.

Twitter outlined its approach to synthetic and manipulated media with a new rule in February: "You may not deceptively share synthetic or manipulated media that are likely to cause harm. In addition, we may label Tweets containing synthetic and manipulated media to help people understand the media's authenticity and to provide additional context."

Since then, it has labeled some tweets as misleading — and courted a political firestorm in doing so, after placing fact-checks next to two tweets from President Donald Trump about vote-by-mail policy in late May.

Other social media platforms, such as Facebook, Instagram, and YouTube, are also manipulated with misinformation, experts say. The New York Times reported that a Russian government troll farm ran a Facebook page from 2014 to 2017 that amassed more followers than the verified Black Lives Matter Facebook account.

Be careful what you share

In 2016 protest discussions, the University of Washington research team found that 3% of tweets about Black Lives Matter came from accounts linked to a Russian government agency.

That work showed "disinformation campaigns can play 'both sides'" and "prompt certain strong emotions." The researchers see the same patterns happening now. Arif says the Washington team likens refraining from resharing questionable social media posts to washing your hands to avoid spreading a virus.

Disinformation is a commodity bought and sold, according to new research from Japanese cybersecurity firm Trend Micro. The company found that distribution of disinformation is propelled by thriving dark web marketplaces.

"Fake news and cyber-propaganda services offered in these underground spaces involve the exploitation of social networks; typically used to advertise or push a certain message or agenda," the research released last week found. "Cybercriminals generally use autonomous bots, real people, or crowdsourcing programs to manipulate social media platforms. The Russian underground maintains the lowest-priced fake news services among the other forums, and prices have remained steady since 2017."

Experts' tips on not spreading disinformation

The Washington researchers, PolitiFact, McKew, and Kazemi suggest these tips to avoid spreading misinformation:

  • Rely on sources you already trust.
  • Be careful of information tagged with trending hashtags.
  • Vet your sources. Don't retweet someone you don't know.
  • Don't follow new accounts without closely inspecting them.
  • Look to see if an account is new, and if the handle matches the content.
  • If you make a mistake, correct it. Let your followers know that you got something wrong.
  • Remember that accounts seeking to intentionally misinform us might not visibly be on the "other side" of political divides. Disinformation often targets users who might agree with it and spread it.
  • Videos can easily be taken out of context. Be careful not to automatically buy in.
  • Tune into how anxiety and uncertainty are shaping your behavior on social media.

The last point can be crucial, the experts said. "The concern to me at this point is that our own social media psychological well-being is so bad that we are quite vulnerable to manipulation," McKew said.

The need to make sense of a confusing and upsetting time can rush us into unfounded opinions, PolitiFact's Holan says. "When we have multiple protests unfolding in multiple cities, you have to be really careful about drawing conclusions. It may take a couple of days to make sense of things. There is an understandable urgency to make sense of things, but that is just not always possible in the heat of the moment."

