YouTube is serving ads on and recommending videos featuring paedophilia: Report

Since the report, Nestlé, Disney and Fortnite-maker Epic Games have withdrawn their pre-roll ads.


Some disturbing revelations have come to light about content on YouTube and the way the platform monetises it.

Earlier this week, Matt Watson, a YouTuber, posted a 20-minute clip on the platform detailing how comments on the video-streaming website are being used to identify "soft-core paedophilia rings on YouTube". Several Reddit users have confirmed the same. Prominent brands including Disney, Nestlé and Epic Games have since started pulling their ads from YouTube after pre-roll ads from these companies were found running on the controversial videos.

Paedophilic content on YouTube was first reported in 2017 by The Verge, and even then, many big companies had pulled their ads.

YouTube logo. Reuters

Watson revealed that in the sea of content available on YouTube, there is a hidden genre of videos in which young girls and boys are seen doing activities that could be construed as sexually suggestive, such as playing Twister or performing gymnastics, and, in some cases, showing their exposed rear, underwear or genitals. These videos have views in the hundreds of thousands, with some even clocking a million. Watson also demonstrated how searching for certain keywords can lead you down a rabbit hole of such abusive content.

In his video, Watson explains how, once you view one such video, YouTube's algorithm leads you to another of the same genre. Essentially, this wave of degenerate viewers is indirectly aided by YouTube's algorithm, which does what it has been designed to do by showing viewers what it thinks they'll want to watch next.

However, in this case, it is both actively and passively lending a hand in the production and distribution of content featuring paedophilia.

In parallel, the comments on these videos are flooded by scores of paedophiles sharing timestamps for the parts of the videos that could be seen as sexually suggestive (by other such sick people!).

What makes this scenario even worse is that these videos are also being monetised on YouTube, including pre-roll adverts from big-name brands such as Fiat, Fortnite, Grammarly, Nestlé, Disney, L’Oréal and Maybelline, according to a report by Wired.

Since the reports about these videos featuring child exploitation came out, Nestlé, Disney and Fortnite-creator Epic Games have withdrawn their ads from YouTube, according to reports by The Verge and Bloomberg.

YouTube has been battling inappropriate content on its platform for years now. While it has taken down some popular channels following child abuse revelations (according to a report by BBC, and another in the case of FamilyOFive), the effort still feels ineffective, with a number of videos of "'pre-teen models' and groups of young girls bathing, doing stretches" still up there, per Wired.

Meanwhile, a YouTube spokesperson told Wired that the platform has been aggressively enforcing its policies against such content, and that any videos or comments that endanger minors are immediately reported to the relevant authorities so action can be taken. The spokesperson reportedly added that such material is also removed from YouTube, with the associated accounts deleted.

Unfortunately, despite these efforts, many such accounts continue to live and thrive on the platform.

If you come across any such videos or comments trying to promote child exploitation in any way, do report the content to YouTube by selecting the Report option below the video.
