YouTube has recently come under a great deal of criticism following reports that YouTube Kids, the child-friendly version of YouTube, is filled with inappropriate content masked as children’s cartoons. YouTube Kids offers content meant for children under the age of 13. However, it’s apparent that inappropriate videos have been able to slip past YouTube’s algorithms, using animation, cartoons, and child-focused keywords to disguise themselves as appropriate videos for children. To prevent any more inappropriate videos from making it onto YouTube Kids, YouTube has announced it will be rolling out a new policy that age-restricts inappropriate videos masquerading as children’s content in the main YouTube app, preventing them from reaching YouTube Kids.
For a video to appear in the YouTube Kids app, it has to pass through a filter managed by YouTube’s algorithm, which is supposed to identify whether a video is appropriate content for children. If the video is found to be inappropriate, or if it violates any of YouTube’s policies, it isn’t let into the app. Human moderators will now also review videos in the main YouTube app that are flagged by volunteer Contributors or by systems that can identify popular children’s characters in videos that seem questionable.
If a human moderator finds a video unsuitable for YouTube Kids, it will be age-restricted in the main YouTube app.
According to YouTube, this policy has been in the works for a while and was not created in response to the recent reports about inappropriate content on YouTube Kids.
YouTube is hoping that this new policy will prevent any inappropriate content from making it onto YouTube Kids. It takes a few days for content to move from the main YouTube app into the YouTube Kids app, which leaves time for the human moderators, Contributors, and this new policy to catch inappropriate content before it goes unnoticed.