
YouTube has just released its first-ever community guidelines enforcement report, and it reveals some significant numbers. According to the report, YouTube removed 8.3 million videos between October and December of last year.

The platform has faced some pretty harsh criticism in the past year, namely for its inability to spot and remove inappropriate videos disguised as child-friendly content. So, YouTube wants everyone to know that it's doing something to address that criticism. The company said that the reason for releasing this report is to “show the progress [it’s] making in removing violative content from [its] platform.”

Out of those 8.3 million removed videos, almost 6.7 million were flagged by automated systems. That means an overwhelming majority of the videos were flagged by YouTube’s algorithms before any human had seen them. YouTube argues that using its algorithms to flag content, rather than relying on human operators alone, leads to “more people reviewing content, not fewer.”

To be clear, YouTube’s algorithms mostly forward videos they suspect to be in violation to human operators rather than removing the videos themselves. The one exception is spam, which the algorithms do remove directly.

YouTube claims its algorithms are “paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).”

According to YouTube, before it started using its algorithms at the beginning of 2017, only 8 percent of the videos removed for violent extremist content were taken down before reaching 10 total views. Now, with its algorithms, that figure has jumped to 50 percent. Additionally, YouTube reports that from October to December, 75.9 percent of automatically flagged videos were removed before receiving any views.

The big takeaway from this report is that YouTube is trying to reassure people that its algorithms are working, even though that claim has been questioned in the past. Check out the entire enforcement report for more information about what YouTube has been doing to enforce its policies and Community Guidelines.

Sean Berry is a blogger and Videomaker's Associate Editor.