More Trouble For YouTube Emerges as Reports Claim They Aren’t Doing Enough to Remove Inappropriate Comments

This past Friday, the BBC and The Times reported on their own investigations into crude comments found on YouTube videos featuring minors.

During their investigations, they found that only a small portion of the obscene comments reported through YouTube’s ‘report content’ system were being removed. The BBC says it detected 28 obscene comments, and that over “several weeks” only five of the 28 were deleted. No action was taken on the other 23 until the BBC contacted YouTube directly and provided a list of the comments it had been monitoring. After that, the BBC reports, all of the “predatory accounts” were deleted within 24 hours.

https://youtu.be/-Vx52-BOj5E

The Times also reported finding advertisements from major brands running alongside inappropriate videos targeting children. Brands including Adidas, Deutsche Bank, Mars, Cadburys and Lidl have already suspended their advertising on YouTube.

A YouTube spokesperson said those ads were never supposed to run alongside that kind of content and that the company is working urgently to fix the problem.

“There shouldn’t be any ads running on this content and we are working urgently to fix this. Over the past year, we have been working to ensure that YouTube is a safe place for brands. While we have made significant changes in product, policy, enforcement and controls, we will continue to improve,” said the spokesperson.

Adding to YouTube’s troubles, BuzzFeed reported that typing “how to have” into YouTube’s search bar produced an inappropriately sexual autocomplete suggestion.

The YouTube spokesperson responded: “Earlier today our teams were alerted to this profoundly disturbing autocomplete result and we worked to quickly remove it as soon as we were made aware. We are investigating this matter to determine what was behind the appearance of this autocompletion.”

The BBC reported that sources with knowledge of YouTube’s content moderation system claim associated links can be stripped out of content reports submitted by the public, meaning YouTube employees may be unable to see which specific comment is being flagged. The BBC also criticised YouTube’s lack of support for members of its Trusted Flaggers program, noting that those members believe YouTube could be doing much more for them.

“We don’t have access to the tools, technologies and resources a company like YouTube has or could potentially deploy,” the BBC was told. “So for example any tools we need, we create ourselves.”

“There are loads of things YouTube could be doing to reduce this sort of activity, fixing the reporting system to start with. But for example, we can’t prevent predators from creating another account and have no indication when they do so we can take action.”

Just last week, YouTube announced that they were doubling down on their efforts to combat inappropriate content targeting children and that they would be taking an “aggressive stance” against crude comments on videos featuring minors. It is possible that YouTube had not yet enforced this policy at the time of the investigations, but these reports certainly don’t help the company’s image amid all of this controversy.

Sean Berry
Sean Berry is Videomaker's managing editor.
