YouTube will no longer recommend so-called conspiracy videos, such as those claiming that the Earth is flat or denying major historical events like the September 11 terror attacks or the Holocaust, NBC News reports. The company announced in a blog post that these types of videos, along with those espousing medically inaccurate or otherwise blatantly false information, would no longer appear as suggested additional viewing at the conclusion of other YouTube videos.
Currently, if a user watches a video on a given topic, like personal fitness, it is followed by a series of recommended videos on similar subjects. This has long been the case with conspiracy-oriented content as well, which could create a veritable rabbit hole of false, misleading, and often downright crazy additional content.
YouTube has called out these types of videos as coming “close to” violating the platform’s community guidelines, meaning they will now be handled differently, though they will remain on the platform since they do not break any site-wide rules.
Former Google engineer Guillaume Chaslot, who helped create the recommendation system that drives suggested videos on YouTube, celebrated the decision.
“It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable,” he wrote in a series of tweets on the topic.
Chaslot also shed some light on the reasoning behind YouTube’s approach to recommended videos, explaining that the goal of the technology is simple: to keep users on the site watching additional videos for as long as possible, maximizing the number of ads that can be served and thus the revenue the platform brings in.
“While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community,” the company said in a blog post announcing the change. “To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube.”
So the affected videos will remain on the site, even as they “come close to” violating the community guidelines. Creators of such videos will simply have to rely on users actively seeking out their content, rather than benefiting from recommendations that funnel in users who have recently watched similar material.
At this time, the change will apply to a small subset of videos in the United States, with further development expected to bring it to other markets in the future.