YouTube’s algorithm has come under constant criticism for recommending conspiracy videos. Now, YouTube says that will finally stop.
YouTube recently announced that its algorithm will no longer suggest “borderline” videos — those that come close to violating community guidelines or that “misinform users in a harmful way,” such as a video touting miracle cures for life-threatening illnesses.
How many videos will be affected?
According to YouTube, the change will affect less than 1 percent of the videos on the platform. But given the sheer amount of content on YouTube, even 1 percent likely amounts to millions of videos.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” YouTube said in a blog post.
How is YouTube going to stop recommending conspiracy videos?
While YouTube has hired more and more human moderators to review videos, its algorithm will still decide which videos appear in the recommended section. Humans will train the AI, however.
The decision to keep the algorithm as the main force behind recommendations leaves a lot of open questions. Many may wonder what, if anything, will actually change, since the algorithm was the source of the problem in the first place. We’ll have to wait and see. The new policy will roll out gradually, starting with a small number of videos in the US and expanding to the entire platform worldwide as the algorithm becomes more refined.
It’s also worth noting that you’ll still see borderline videos from channels you’re subscribed to.