West Virginia sues Apple over CSAM on iCloud
State litigation targets encryption defaults and pushes mandatory cloud scanning; "think of the children" becomes a policy lever for surveillance-by-design
West Virginia has sued Apple, alleging the company’s iCloud service has become a conduit for child sexual abuse material (CSAM) because Apple refuses to deploy “industry-standard” scanning tools across its cloud storage.
According to Fox News, the lawsuit—filed in Mason County Circuit Court—claims Apple is an “outlier” compared with Meta and Google, which generate far more reports to law enforcement related to CSAM. West Virginia’s attorney general, JB McCuskey, argues Apple has prioritized privacy and encryption over child safety, and the complaint seeks to force Apple to implement detection measures that scan user cloud storage.
The state’s own framing is revealing: this is not a narrow dispute about one criminal investigation, but a bid to set policy through litigation. If a state can sue a platform for not surveilling users “enough,” encryption becomes a liability rather than a feature—an inversion that would have pleased every would-be wiretapper since the telegraph.
Fox reports that the complaint cites internal messages attributed to Eric Friedman, Apple’s former anti-fraud chief, describing iCloud as “the greatest platform for distributing child porn,” and adding that Apple has “chosen to not know in enough places where we really cannot say.” Whether those messages survive evidentiary scrutiny is for the courts. But the political objective is already clear: create a record suggesting Apple knowingly designed a blind spot.
Apple, in a statement quoted by Fox, emphasized child-safety features like Communication Safety—interventions on kids' devices when nudity is detected in Messages, Photos, AirDrop, and FaceTime—without directly addressing the demand at issue: routine scanning of adult users' iCloud content.
The problem is not any doubt that CSAM is evil. It's that the "think of the children" playbook reliably expands from targeted policing to general surveillance. Mandated scanning of cloud storage is functionally a requirement to weaken privacy by default—either through server-side inspection, client-side scanning, or architecture that makes encrypted storage less meaningful. And once that infrastructure exists, it won't stay confined to CSAM. The same state actors will demand it for drugs, extremism, "misinformation," and whatever else becomes the moral panic of the quarter.
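The architectural point above can be made concrete with a toy sketch: blocklist-based content matching, wherever it runs, needs access to unencrypted bytes. This is an illustration only, assuming a simplified exact-hash scheme; real deployments use perceptual hashes (e.g. PhotoDNA-style) that survive re-encoding, and nothing here represents Apple's actual systems.

```python
import hashlib

# Hypothetical blocklist of known-bad content digests. Real systems
# distribute perceptual hashes robust to resizing and re-encoding;
# plain SHA-256 is used here only to keep the sketch self-contained.
BLOCKLIST = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

def scan_before_upload(plaintext: bytes) -> bool:
    """Return True if the content matches the blocklist.

    The structural point: this check requires the *plaintext*.
    Run it client-side and the device inspects content before
    encryption; run it server-side and the provider must be able
    to read stored data. Either way, "end-to-end encrypted" storage
    no longer means the provider cannot see what users hold.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST
```

The design choice the lawsuit demands is exactly where this function runs, and on whose keys; there is no placement that leaves fully encrypted storage intact.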
This case is also a jurisdictional experiment: can a state attorney general litigate his way into setting national encryption norms for a global product? If West Virginia succeeds, other states will copy the tactic, and Apple will face a patchwork of de facto mandates—precisely the kind of regulatory chaos that invites federal “harmonization.”
The lawsuit is not just about iCloud. It’s about whether states can use courts as a backdoor legislature to turn private communications infrastructure into a permanent inspection regime.