
Ads with 'blatant' election disinformation about the upcoming US midterms got approved by TikTok and Facebook, researchers say

Oct 21, 2022, 23:13 IST
  • TikTok and Facebook approved ads with "blatant" disinformation about the US midterms, a report found.
  • Researchers analyzed Facebook, TikTok, and YouTube's ability "to detect and remove election disinformation."

Social media giants TikTok and Facebook approved ads with "blatant" disinformation about the upcoming US midterm elections, a new report published Friday found.

A recent investigation by the nonprofit Global Witness and the Cyber Security for Democracy team at New York University analyzed Facebook, TikTok, and YouTube's ability "to detect and remove election disinformation" in the lead-up to November's midterms.

The groups found that TikTok — which is owned by Chinese company ByteDance — fared the worst of the three platforms when it came to failing to block the "deceptive" test ads submitted by the researchers.

TikTok approved 90 percent of the ads containing both "misleading and false election disinformation," the report said.

As part of the experiment, the researchers submitted 20 ads targeting battleground states like Arizona, Colorado, and Georgia to TikTok, Meta's Facebook, and Google's YouTube in both English and Spanish. All of the ads that the researchers submitted violated the social media platforms' election ad policies, according to the report.


Though TikTok has banned political advertising, the platform approved nearly all of the ads riddled with falsehoods, including claims that voting days would be extended and that social media accounts could be used for voter verification, the researchers said.

"TikTok also approved ads that dismiss the integrity of the election, suggest results can be hacked or are already pre-decided, and discourage voters from turning out," according to the researchers.

One ad that TikTok did reject said that voters must be vaccinated against COVID-19 in order to be allowed to vote in the election.

That ad, however, was accepted by Facebook, the groups said.

While Facebook fared better than TikTok, it approved "a significant number of similarly inaccurate and false ads," the researchers said.


YouTube performed the best, according to the researchers, as it "detected and rejected every single such ad submitted and also suspended the channel used to post the test ads."

The researchers said that once the platforms notified them that the ads had been approved, they deleted the ads so that none were ever published.

"So much of the public conversation about elections happens now on Facebook, YouTube, and TikTok. Disinformation has a major impact on our elections, core to our democratic system," Laura Edelson, the co-director of the Cyber Security for Democracy team said in a statement.

Edelson added, "YouTube's performance in our experiment demonstrates that detecting damaging election disinformation isn't impossible. But all the platforms we studied should have gotten an 'A' on this assignment."

"We call on Facebook and TikTok to do better: stop bad information about elections before it gets to voters," Edelsen said.


A TikTok spokesperson told Insider that the popular short-form video app "is a place for authentic and entertaining content which is why we prohibit and remove election misinformation and paid political advertising from our platform."

"We value feedback from NGOs, academics, and other experts which helps us continually strengthen our processes and policies," the spokesperson said.

A Meta spokesperson pushed back against the report, telling Insider, "These reports were based on a very small sample of ads, and are not representative given the number of political ads we review daily across the world."

"Our ads review process has several layers of analysis and detection, both before and after an ad goes live," the spokesperson said. "We invest significant resources to protect elections, from our industry-leading transparency efforts to our enforcement of strict protocols on ads about social issues, elections, or politics – and we will continue to do so."

A Google spokesperson told Insider in a statement on Friday that the company has "developed extensive measures to tackle misinformation on our platforms, including false claims about elections and voting procedures."


In 2021 alone, the spokesperson said, Google blocked or removed more than 3.4 billion ads for violating its policies, including 38 million for violating its representation policy.
