Dallas-based AT&T Inc. and several other major companies have pulled spending from YouTube after their advertisements were displayed on videos with pedophilic comments and activity.
The Google-owned video platform became embroiled in scandal this week after a YouTuber posted a video showing evidence of what he called a “wormhole” into soft-core pedophilia on the site.
“Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube,” an AT&T spokesperson said in an email.
The video, posted last Sunday by content creator Matt Watson, illustrated how YouTube’s algorithm would apparently route users from regular videos to content featuring children within just a few clicks.
While the content with children is often innocuous, comments on the videos are filled with pedophilic statements. Perhaps even more alarming, once a user arrives at one such video, YouTube’s algorithm seems to suggest only similarly “suggestive” content, usually featuring young girls.
Some of the videos of children highlighted by Watson were uploaded by the minors themselves and show innocent activities such as stretching, dancing or simply talking to the camera. Others are uploaded from accounts that seem to collect videos of minors.
Comments on the videos time-stamp moments where the children’s bodies are exposed in some way or where children do something that commenters consider sexually suggestive. The time stamps are direct links to those frames in the video, leading others to the supposedly provocative moments.
But the comments go well beyond suggestive remarks, Watson said in his video, noting that some users exploit the section to exchange social media contacts or WhatsApp numbers, or even links to child pornography.
Since last week, multiple advertisers including Disney, Nestle and Epic Games have stopped advertising on YouTube.
“While investigations are on-going directly with YouTube and our partners, we have decided to pause advertising on YouTube globally, already effective in North America and several other markets,” a Nestle spokesperson said in an email. “We will revise our decision upon completion of current measures being taken by Google to ensure Nestle advertising standards are met.”
YouTube responded to a request for comment Friday afternoon, outlining in an email what the company has done in the last two days to tackle the issue.
A spokesperson said YouTube had disabled comments on tens of millions of videos featuring minors and terminated more than 400 channels over their comments. The company said it removed dozens of videos that put young people at risk and changed search auto-completions that led to the content.
“Any content — including comments — that endangers minors is abhorrent, and we have clear policies prohibiting this on YouTube,” the spokesperson said. “We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors.”
The controversy is reminiscent of scandals in 2017 that led dozens of brands to drop advertising on YouTube after it was shown alongside violent, hateful or inappropriate content.
Watson’s video, which had been viewed 2.7 million times as of Saturday, took specific issue with YouTube’s oft-criticized standards for demonetizing content, rules that creators say are applied inconsistently. Some of the children’s videos with inappropriate comments were monetized, which is why advertisements were shown on or next to those videos.
Now, the company says it will limit or prohibit ads on videos featuring minors or videos that receive predatory comments. Children under 13 are not allowed to create or own accounts under YouTube’s terms of service.
“There’s more to be done, and we continue to work to improve and catch abuse more quickly,” a YouTube spokesperson said.