YouTube’s Algorithm Accused of Facilitating Paedophile Rings

Tim Cross 19 February, 2019 

YouTube has found itself embroiled in yet another brand safety crisis after YouTuber Matt Watson claimed to have uncovered a “wormhole into a soft-core paedophile ring” on the platform. Watson, in a video released earlier this week, claimed that despite YouTube’s efforts to better moderate its content, paedophiles are still able to use the platform to share sexually suggestive content featuring children, which in some cases is monetised via ads.

Watson said that, using a newly created YouTube account and a virtual private network (to eliminate any bias from YouTube’s recommendation algorithm), he could be led by YouTube’s recommendation sidebar to exploitative content involving children “within about five clicks” of certain innocuous content.

In the video, Watson starts by searching for ‘bikini haul’ videos (videos where adults try on bikinis – which may be sexually suggestive but are altogether non-offensive). By clicking on recommended videos in YouTube’s sidebar, he is very quickly led to a video featuring two young girls in swimsuits. Once he’s clicked on this video, the sidebar becomes filled with similar content involving prepubescent girls in swimsuits or pyjamas.

The videos aren’t necessarily offensive in and of themselves – some may be family videos uploaded onto YouTube, or videos recorded by the girls themselves. But it seems clear from the comments that they are being shared between paedophiles.

Commenters in many cases will give time-stamps for moments in the video where the girls are in “sexually implicit positions”, according to Watson, and often make comments about the girls’ appearances. Watson also claimed he was able to find external links to child porn sites in these comment sections.

Perhaps most concerning for YouTube is the active role its algorithm plays in the sharing of this content. The videos aren’t all appearing in the recommended sidebar because they are uploaded by the same malicious accounts – rather, YouTube’s algorithm appears to be pulling together innocent videos from disparate sources, making it easier for paedophiles to share them and to find more like them.

And some of the videos are being monetised via ads too. Watson said that he came across ads for brands including Disney, McDonald’s, Ikea and Reese’s which were run as pre-roll ads or display ads alongside the clips.

YouTube said that it is working to improve its policing of any content and comments which endanger minors. “We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts,” it said in a statement. “We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue.”

But whether brands will be satisfied with the progress YouTube is making remains to be seen, particularly given that this isn’t the first problem with child abuse content that YouTube has faced. Over the past few years, the company has come under fire for allowing malicious actors to exploit its algorithm and expose children to sexually suggestive and disturbing content, and for hosting videos which showed children in pain and distress.

And the news follows separate stories which have called into question YouTube’s ability to properly moderate its content. The fact that YouTube wrongly banned several innocuous channels for hosting sexual content while missing the videos and comments uncovered by Watson might raise eyebrows, as might YouTube’s recent decision to stop helping agencies pay fees for third-party brand safety services like OpenSlate and Pixability.


About the Author:

Tim Cross is Assistant Editor at VideoWeek.