Things haven’t been good for YouTube lately; the company has been battling allegations that it hasn’t done enough to monitor obscene comments and inappropriate content aimed at children on its platform. This, paired with the residual effects of the Adpocalypse, has ultimately led to a loss of major advertisers, and YouTube’s credibility has been called into question.
Now, YouTube reports that it will be taking additional steps to moderate inappropriate content. YouTube CEO Susan Wojcicki wrote in a blog post on Monday that YouTube will increase its staff by 10,000 workers next year in an attempt to better moderate video content. The company will also publish regular reports detailing the number of flags it receives and the actions it takes to remove videos and comments that violate its content policies.
Though YouTube recently updated its policies to be stricter about inappropriate content, it clearly didn’t have enough manpower to fully enforce them. The company is hoping that growing its moderation staff to 10,000 will help its situation — BuzzFeed estimates that would be a 25 percent increase over YouTube’s current staff numbers.
However, while 10,000 may seem like a large number at first, considering the sheer volume of videos uploaded to YouTube every day and the enormous amount of traffic the platform gets, a staff of 10,000 workers isn’t really that significant.
It’s highly probable that YouTube will still rely heavily on its algorithms; Wojcicki wrote that YouTube plans to use machine learning technology to help it “quickly and efficiently remove content that violates our guidelines.” Wojcicki herself noted that 98 percent of the 150,000 violent extremist videos YouTube has removed were flagged by its algorithms, and that since June “the technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess.”
A workforce of 10,000 moderators does look good as a headline, but it seems YouTube’s main plan for tackling inappropriate content remains using its algorithms for the majority of the work, with some review conducted by human employees.