
YouTube says it will soon recommend fewer conspiracy theory videos on its platform

Jan 26, 2019, 03:31 IST

Silhouettes of users are seen next to a screen projection of the YouTube logo in this picture illustration taken March 28, 2018. Dado Ruvic/Reuters

  • YouTube announced in a company blog post on Friday that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.
  • Examples of videos YouTube hopes to promote less often include ones that claim that the Earth is flat or promote phony cures for serious illnesses.
  • "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.
  • YouTube has long struggled with its recommendations algorithm, catching backlash for promoting conspiracy theories and leading users to more extreme corners of the Internet.

The Earth is not flat, and soon you should start seeing fewer videos on YouTube that say it is.

On Friday, YouTube announced in a company blog post that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.

Essentially, YouTube, which is owned by Google, thinks it has created a better solution for stopping the spread of conspiracy theory videos on its platform.

Examples of videos YouTube hopes to promote less often include those claiming the Earth is flat, as well as those that promote phony cures for serious illnesses or make blatantly false claims about historical events like 9/11.


Many of these "borderline" videos don't necessarily violate YouTube's Community Guidelines, but the company says that limiting their reach will provide a better experience for its users. "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.

These videos will not be removed entirely from the platform, and they may still appear in search results or recommendations if a user follows certain channels, the company explained.

YouTube also provided a bit of insight into how its recommendation model works, which involves "human evaluators and experts from all over the US" reviewing videos and using that feedback to train its machine learning systems.

YouTube has long struggled with its recommendations algorithm, catching backlash for promoting conspiracy theories and facing criticism for leading its users to more extreme corners of the Internet.

Read more: One viral thread shows how quickly YouTube steers people to wacko conspiracy theories and false information


"It's just another step in an ongoing process," the company said in its blog post on Friday. "But it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."

