It’s safe to say that we’ve all fallen victim to YouTube’s algorithm. As you probably already know, the algorithm dictates which videos YouTube recommends and what shows up in search results. The algorithm itself is pretty complex, but Guillaume Chaslot, a former YouTube engineer, has given us a little insight into which videos it likes to promote for common searches.
While employed at YouTube, Chaslot was one of the programmers working on YouTube’s recommendations. By his own account on Twitter, he was fired by Google in 2013 for trying to give users more control over its algorithms. In 2016, Chaslot began tracking YouTube’s recommendations, believing that the algorithm leads users down a path they wouldn’t have taken on their own.
To Chaslot, this is a huge problem. When it comes to the inner workings of YouTube’s algorithm, users are left in the dark. Sure, they can see things like the number of likes, dislikes or views a video has, but they can’t see how often it’s recommended or where it’s being recommended. That opacity opens the door for YouTube to manipulate which videos are most visible. To combat this, Chaslot has built a website called AlgoTransparency. The site displays the top recommended videos for popular topics. You can search for topics like mass shootings, recent elections or science to see which videos are recommended most frequently for each search term.
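The basic idea behind a tally like AlgoTransparency’s is simple enough to sketch: start from some seed videos for a search term, follow the top “Up next” recommendations a few hops deep, and count which videos keep appearing. Here’s a minimal illustration in Python of that counting logic, not Chaslot’s actual code; `fetch_recommendations` is a hypothetical stand-in for however one would scrape YouTube’s recommendation list as a logged-out user.

```python
from collections import Counter

def fetch_recommendations(video_id):
    """Hypothetical helper: return the video IDs YouTube lists as
    'Up next' for the given video. In practice this would mean
    scraping the watch page as a logged-out user."""
    raise NotImplementedError

def tally_recommendations(seed_video_ids, hops=2, per_video=5):
    """Follow the top recommendations a few hops deep and count how
    often each video appears -- the rough idea behind the rankings
    a site like AlgoTransparency displays."""
    counts = Counter()
    frontier = list(seed_video_ids)
    for _ in range(hops):
        next_frontier = []
        for vid in frontier:
            recs = fetch_recommendations(vid)[:per_video]
            counts.update(recs)
            next_frontier.extend(recs)
        frontier = next_frontier
    return counts.most_common()
```

The key point is that a video recommended from many different starting points rises to the top of the tally, which is exactly the kind of visibility ordinary view counts don’t reveal.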
Chaslot has found that the algorithm often recommends conspiracy-theory, anti-science or anti-media videos. He also found that it prefers videos featuring politicians who speak in aggressive, bullying ways.
The site also displays the terms mentioned most often in the top recommended videos. Look up “NASA” on YouTube, for example, and the most frequent terms in the top videos are “NASA,” “Alien,” “Secret” and “UFO.”
If you take some time to look around the site, you’ll begin to see some strange results. For instance, a number of videos from Jimmy Kimmel Live pop up when you search “vaccine facts.” Just as concerning, plenty of anti-vaxxer videos from conspiracy theorists show up for the same search.
Chaslot hopes that AlgoTransparency will help inform YouTube users about where its algorithms are leading them. He told MIT Technology Review, “Everybody should know, if you start to spend time on YouTube, where it’s going to take you.” Visit algotransparency.org to see for yourself.