West Virginia Attorney General Files Negligence Suit Against Apple Over iCloud CSAM Storage

West Virginia AG Files Lawsuit Thursday

On February 19, 2026, West Virginia Attorney General JB McCuskey filed a negligence lawsuit against Apple, alleging the company allowed child sexual abuse material (CSAM) to be stored and shared through its iCloud service. The complaint seeks statutory and punitive damages, injunctive relief, and mandatory implementation of detection measures. McCuskey described each view of the images as revictimizing children and called Apple’s inaction “despicable” [1].

Apple Accused of Prioritizing Privacy Over Safety

The suit argues that Apple’s control of hardware, software, and cloud infrastructure prevents it from claiming ignorance of CSAM on its platforms. Internal communications cited in the filing include a 2020 text in which an executive labeled Apple “the greatest platform for distributing child porn” because of its privacy focus. The complaint contends Apple abandoned its NeuralHash detection model after backlash in 2021, leaving the company with less effective tools than competitors [1].
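
NeuralHash, like other perceptual hashes, reduces an image to a short fingerprint that survives resizing and recompression, so known illegal images can be found by comparing fingerprints rather than exact bytes. Apple has not published the system in full, so the sketch below is a generic stand-in: a toy 64‑bit average hash plus the Hamming‑distance matching step that any such system, NeuralHash included, depends on. The hash function, threshold, and hash list here are illustrative assumptions, not Apple’s design.

```swift
import Foundation

/// Toy 64-bit "average hash" over an 8x8 grayscale thumbnail: each bit is 1
/// when that pixel is brighter than the image mean. NeuralHash derives its
/// bits from a neural-network embedding instead, but the matching step
/// below is the same idea. Illustrative only; not Apple's algorithm.
func averageHash(of pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64, "expects an 8x8 grayscale thumbnail")
    let mean = pixels.reduce(0) { $0 + Int($1) } / pixels.count
    var hash: UInt64 = 0
    for (bit, pixel) in pixels.enumerated() where Int(pixel) > mean {
        hash |= 1 << UInt64(bit)
    }
    return hash
}

/// Number of differing bits; a small distance means "perceptually similar",
/// which is how near-duplicates survive cropping and recompression.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Match a fingerprint against a list of known-image hashes (an
/// NCMEC-style hash list, in the CSAM-detection setting).
func matchesKnownImage(_ hash: UInt64, in knownHashes: [UInt64],
                       threshold: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance(hash, $0) <= threshold }
}
```

The complaint’s contention turns on that last step: without some fingerprint comparison against known-image lists, uploads are never checked against anything at all.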

Federal Reporting Requirements Highlight Detection Gap

Under 18 U.S.C. § 2258A, tech firms must report detected CSAM to the National Center for Missing & Exploited Children (NCMEC). The complaint notes that Google reported 1.47 million CSAM detections in 2023, while Apple reported only 267, roughly 5,500 times fewer, suggesting non‑compliance or inadequate detection. The AG’s office argues Apple’s low reporting volume violates federal law and endangers children [1].

Apple Points to Communication Safety Features

Apple responded by emphasizing its “Communication Safety” feature, which blurs nudity and warns users in Messages, FaceTime, AirDrop, Contact Posters, and Photos. The company also highlighted parental controls designed to protect children while preserving privacy. Apple maintains it is innovating to combat threats, despite the lawsuit’s claims of insufficient detection technology [1].
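
The feature Apple cites has a developer-visible counterpart: since iOS 17, the SensitiveContentAnalysis framework lets third-party apps run the same on-device nudity check. A minimal sketch, assuming the com.apple.developer.sensitivecontentanalysis.client entitlement and a user who has enabled Sensitive Content Warnings or is covered by Communication Safety:

```swift
import Foundation
import SensitiveContentAnalysis  // iOS 17+ / macOS 14+

/// Ask the on-device classifier whether an image should be blurred.
/// The analysis runs entirely on the device; nothing is uploaded and
/// nothing is reported to Apple or NCMEC.
func shouldBlur(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Unless the user has enabled Sensitive Content Warnings (or the
    // device is covered by Communication Safety), the policy is
    // .disabled and there is nothing to check.
    guard analyzer.analysisPolicy != .disabled else { return false }

    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

The scope visible here is the crux of the dispute: the check is opt-in, per-image, and on-device, warning the user rather than detecting and reporting known CSAM the way hash-matching systems do.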

Timeline

2020 – An internal Apple text labels the company “the greatest platform for distributing child porn” because of its privacy‑first architecture, highlighting internal awareness of CSAM risks[1].

2021 – Apple abandons its NeuralHash detection model after a privacy backlash, opting out of a tool that could have identified illegal images more effectively than its current methods[1].

2023 – Under federal CSAM‑reporting law, Google submits 1.47 million reports to NCMEC, while Apple reports only 267, suggesting a significant compliance gap[1].

April 2025 – Joshua Aaron launches ICEBlock as an “early warning system” to alert users when ICE agents operate nearby, aiming to help people avoid contact without interfering with law‑enforcement activities[2].

October 2025 – Apple removes ICEBlock from the App Store following a Justice Department request and pressure from U.S. Attorney General Pam Bondi, citing concerns that the app could endanger federal officers; existing users retain the app, but new downloads are blocked[2].

October 2025 – Google pulls several immigration‑tracking apps from Google Play, though ICEBlock was never available on Android, reflecting broader platform action against similar tools[3].

December 8, 2025 – Joshua Aaron files a federal lawsuit in Washington, D.C. (and Texas) accusing Trump administration officials of pressuring Apple to delete ICEBlock and of violating his First Amendment rights; the suit seeks a declaration that the app is protected speech and protection from prosecution[2][3].

December 2025 – House Homeland Security Chair Andrew Garbarino and Oversight Subcommittee Chair Josh Brecheen send letters to Apple and Google demanding information on apps that enable anonymous reporting of DHS activities, signaling congressional scrutiny of platform moderation[3].

February 19, 2026 – West Virginia Attorney General JB McCuskey files a negligence lawsuit against Apple over iCloud storage of child sexual‑abuse material, demanding statutory and punitive damages, injunctive relief, and mandatory detection tools; he calls Apple’s inaction “despicable” and “inexcusable”[1].

February 2026 – Apple replies that its “Communication Safety” feature blurs nudity and warns users across Messages, FaceTime, AirDrop, Contact Posters, and Photos, asserting that parental controls are built around safety, security, and privacy[1].
