Apple is under legal scrutiny after a woman, using a pseudonym to protect her identity, filed a lawsuit claiming the company failed victims of child sexual abuse by abandoning its CSAM detection plan.
The feature, announced in 2021, was designed to scan images uploaded to iCloud for child sexual abuse material (CSAM) using on-device technology. While Apple retained a nudity-detection tool for its Messages app, it dropped the CSAM detection feature in 2022 amid privacy and security concerns raised by experts and advocacy groups.
The lawsuit argues that Apple’s decision has allowed illegal material to keep circulating online and has broken the company’s promise to protect abuse victims like the plaintiff.
Allegations of negligence
The 27-year-old plaintiff, who was abused as a child, claims Apple’s decision to eliminate CSAM detection left her vulnerable. She recounts that law enforcement informed her that images of her abuse had been stored on iCloud and were discovered through a MacBook seized in Vermont. She accuses Apple of selling “defective products” that fail to safeguard victims, and she is seeking changes to Apple’s practices as well as compensation for affected individuals.
The lawsuit highlights that other tech companies, like Google and Meta, employ CSAM-scanning tools that detect significantly more illegal material than Apple’s nudity-detection features. Her legal team estimates up to 2,680 victims could join the lawsuit, potentially resulting in damages exceeding $1.2 billion if Apple is found liable.
Similar cases raise concerns
This lawsuit isn’t Apple’s only legal battle concerning CSAM-related issues. In a separate case, a nine-year-old victim and her family sued Apple in North Carolina. The girl alleges that strangers used iCloud links to send her CSAM videos and encouraged her to create and upload similar material.
Apple has sought to dismiss this case, citing federal Section 230 protections, which shield companies from liability for user-uploaded content. However, recent court rulings suggest these protections may not apply if companies fail to actively moderate harmful content.
Apple’s response and ongoing debate
Apple has defended its stance, reiterating its commitment to combating child exploitation without compromising user privacy. The company pointed to features such as nudity detection in its Messages app and options that let users report harmful content.
However, the plaintiff’s lawyer, Margaret Mabie, argues that these measures are insufficient. Mabie’s investigation uncovered over 80 instances of the plaintiff’s images being shared, including by an individual in California who stored thousands of illegal images on iCloud.
As legal proceedings unfold, Apple faces growing pressure to balance user privacy with effective protections against harmful content.