YouTube has deleted more than 270 accounts and over 150,000 videos from its site, and turned off commenting on another 625,000 videos targeted by child predators, according to Vice News.
This is the second time in less than a year that major advertisers have fled the platform after their ads appeared alongside offensive content.
Advertising partners including Adidas, Deutsche Bank, Cadbury, and Hewlett-Packard have frozen advertising on the platform after finding that their advertisements were being played before or alongside videos that exploited children or depicted them inappropriately.
“Over the past week we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content,” YouTube said in a statement to Vice. “Content that endangers children is abhorrent and unacceptable to us.”
YouTube has had previous issues with child-exploitation material being uploaded to its platform. In many cases, this content received millions of views.
Last week, investigations by the BBC and The Times reported finding "obscene comments on videos of children uploaded to YouTube."
Only a small portion of the comments were removed after being flagged via YouTube's 'report content' system.
The BBC said that "YouTube's comment moderation system failed to remove obscene comments targeting children." Volunteers in YouTube's unpaid Trusted Flagger program shed light on the predatory comments flooding YouTube's channels.
Over a period of "several weeks," the BBC said, only five of the 28 obscene comments it had found and reported were deleted. No action was taken against the remaining 23 until the BBC contacted YouTube directly with a full list.
YouTube says all of the "predatory accounts" were then closed within 24 hours.
This follows last week's outrage over the platform's search bar allegedly autocompleting queries with pedophiliac terms.
For example, when a user searched something like "how to," the autocomplete suggested "have s*x kids" and "have s*x with your kids." First reported by BuzzFeed, the suggestions have since been removed.
On the autocomplete issue, a YouTube spokesperson added: "Earlier today our teams were alerted to this profoundly disturbing autocomplete result and we worked to quickly remove it as soon as we were made aware. We are investigating this matter to determine what was behind the appearance of this autocompletion."
As The Guardian notes, YouTube's autocomplete is driven by algorithms based on frequently used search terms, meaning the disturbing suggestions could be the result of a group of people working together to make a term trend on YouTube. The Guardian also reports that none of the results linked to the autocompleted searches showed abusive videos.
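The mechanism The Guardian describes can be pictured with a toy sketch: if suggestions are simply the highest-frequency logged queries matching what the user has typed so far, then a coordinated group searching the same phrase en masse can push it into the suggestions even when no matching videos exist. The Python sketch below is a minimal illustration under that assumption; all names are hypothetical, and YouTube's real ranking system is vastly more sophisticated.

```python
from collections import Counter

class AutocompleteSketch:
    """Toy frequency-based autocomplete: suggestions for a prefix are
    simply the most frequently logged queries starting with that prefix."""

    def __init__(self):
        self.query_counts = Counter()

    def log_query(self, query: str) -> None:
        # Every submitted search bumps that exact query's count.
        self.query_counts[query.lower()] += 1

    def suggest(self, prefix: str, k: int = 3) -> list:
        prefix = prefix.lower()
        matches = Counter({q: n for q, n in self.query_counts.items()
                           if q.startswith(prefix)})
        # Highest-frequency matching queries surface first.
        return [q for q, _ in matches.most_common(k)]

if __name__ == "__main__":
    ac = AutocompleteSketch()
    for _ in range(1_000):
        ac.log_query("how to tie a tie")       # organic traffic
    for _ in range(5_000):
        ac.log_query("how to make slime")      # organic traffic
    # A coordinated group repeatedly searching one phrase can push it
    # to the top of the suggestions, no matching videos required.
    for _ in range(20_000):
        ac.log_query("how to trend this phrase")
    print(ac.suggest("how to"))
    # -> ['how to trend this phrase', 'how to make slime', 'how to tie a tie']
```

In a purely frequency-driven design like this, suggestion quality depends entirely on the honesty of the query logs, which is consistent with The Guardian's observation that the offensive suggestions pointed to no abusive videos at all.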