In a blog post published on Friday, YouTube said it would be making changes to its recommendations algorithm to explicitly deal with conspiracy theory videos. The company says the update will reduce the suggestion of “borderline content and content that could misinform users in harmful ways.”
YouTube clarified what kind of videos fit that description by providing three examples: “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The company noted that this content doesn’t necessarily violate its Community Guidelines. This means that while the content may still exist on YouTube, the site’s algorithm will stop recommending these videos to its users.
To deal with this sort of problematic content, YouTube says it relies on “a combination of machine learning and real people.” Human evaluators and experts will train the recommendation system to evaluate these videos. At first, the changes will apply only to a small number of videos in the U.S.
YouTube says that, overall, less than 1 percent of videos will be affected by this change. But with the platform’s massive video archive and the constant stream of new content being uploaded every minute, that still amounts to a lot of videos.