
YouTube's secret algorithm continues to push misinformation on users, from false election fraud claims to conspiracy theories, according to a new study

Ben Gilbert   

  • YouTube's suggestion algorithm continues to promote videos with misinformation, according to a new study.
  • YouTube keeps the suggestion algorithm a secret, preventing researchers from studying how it works.
  • The study, commissioned by Firefox maker Mozilla, collected reports from nearly 40,000 YouTube users.

YouTube is still suggesting videos containing misinformation and violent content, including false claims about COVID-19, according to a major new study published this month.

Notably, this isn't just a moderation problem with YouTube's vast archive of video content; it's a problem with the suggestion engine at the heart of the video streaming service.

Over 70% of the videos flagged by respondents came through YouTube's suggestion algorithm, an effect that outside researchers can't fully study because the algorithm is a closely guarded secret at Google.

That means YouTube users aren't primarily finding misinformation through search; the service is feeding those videos to them.

In one example, a user who regularly watched videos about wilderness survival "kept getting recommended extreme right wing channels." A video flagged by the user claimed to depict a person connected to Rep. Ilhan Omar committing election fraud; the claims in that video were debunked, yet it remains up on YouTube.

"The goal of our recommendation system is to connect viewers with content they love," a YouTube representative told Insider. "We constantly work to improve the experience on YouTube and over the past year alone, we've launched over 30 different changes to reduce recommendations of harmful content."

YouTube has been repeatedly criticized for flaws in its suggestion algorithm, from surfacing videos with misinformation to perpetuating conspiracy theories. Though some critics have claimed the service can outright radicalize users, studies show the suggestion algorithm is far more adept at compounding previously held ideologies.

YouTube representatives told Insider that the company is looking into bringing in external researchers to study how its suggestion algorithm works, but no specific plans have been announced.

Got a tip? Contact Insider senior correspondent Ben Gilbert via email (bgilbert@insider.com), or Twitter DM (@realbengilbert). We can keep sources anonymous. Use a non-work device to reach out. PR pitches by email only, please.
