
YouTube and parent company Google have recently been working to manage offensive and extremist content, both content that violates their policies and content that doesn't. Google has announced that it will begin isolating YouTube videos that may not directly violate its standards but still contain “controversial religious or supremacist content.”

In a more extreme take on the tactics already used to keep ads from running against controversial content, Google and YouTube won’t run ads on any video deemed offensive, which means the video’s creators won’t be able to earn money from it. In addition, comments will be disabled, the video won’t appear in any recommended lists, and it won’t play on other websites where it has been embedded. A warning screen will also appear before the video starts when it is played on YouTube.

What Google and YouTube are trying to do is sharply reduce public engagement with these offensive or extremist videos. It will be far harder for such videos to gain attention when they are excluded from recommended lists and can’t collect comments.

Google has also rolled out the Redirect Method, which sends users searching for flagged keywords to videos that counter the extremist content they appear to be after. Google is also using machine learning-based video detection to track down and remove content that clearly violates YouTube’s policies.

Content creators aren’t completely powerless, however. If a video is flagged, its creator will receive a notice that it has been deemed offensive and can appeal if they feel the decision is unwarranted. But demonetization and the other restrictions will remain in effect while the appeal is processed.