In a recent report, the Internet Watch Foundation (IWF) disclosed that over 90% of child sexual abuse imagery found online is now self-generated. The foundation identified self-generated content featuring children under 10 on more than 100,000 webpages in the past year, a 66% increase on the previous year. A record 275,655 webpages were confirmed to contain child sexual abuse material (CSAM), an 8% rise, prompting the UK government, supported by the IWF, to renew its criticism of end-to-end encryption. The rise in detected imagery doesn’t necessarily indicate a worsening problem, since improved detection methods account for part of the increase.

Susie Hargreaves, the IWF’s CEO, emphasised that while detecting more content is a positive outcome, the ultimate mission is the elimination of child sexual abuse, not merely the removal of imagery. Shockingly, some self-generated content involved children as young as three, and a fifth of it was categorised as causing the most severe harm.

Hargreaves expressed concern about Meta’s plans to implement end-to-end encryption for Messenger, fearing it would hinder efforts to combat the spread of CSAM. She also criticised Apple for abandoning plans to scan for CSAM on iPhones, calling such decisions baffling given the increasing prevalence of such content on the web. The charity’s latest figures, compiled from 2023 data, reinforce its opposition to Meta’s encryption plans: Hargreaves argues that turning a blind eye to the content shared on platforms such as Messenger would give criminals a safe space to share and spread abusive imagery undetected.

UK Security Minister Tom Tugendhat echoed these concerns, stating that the alarming rise in online child sexual abuse, affecting increasingly younger victims, necessitates stronger measures.
He criticised Meta’s decision to implement end-to-end encryption without essential safety features, emphasising the potentially catastrophic impact on law enforcement’s ability to bring perpetrators to justice. In response, Meta defended its approach, pointing to its ongoing safety measures against child abuse and stressing its commitment to online security, noting that it provides more reports to the National Center for Missing & Exploited Children (NCMEC) than other companies. Apple, for its part, did not respond to inquiries about its decision to delay plans for client-side scanning of iPhones.