The legal fight that could force Apple to rethink iCloud design



West Virginia’s attorney general filed a lawsuit against Apple on Thursday accusing the iPhone maker of knowingly allowing its software to be used for storing and sharing child sexual abuse material. 

John B. McCuskey, a Republican, accused Apple of protecting the privacy of sexual predators who use iOS, which can sync images to remote cloud servers through iCloud. McCuskey called the company’s decisions “absolutely inexcusable” and accused Apple of running afoul of West Virginia state law.

“Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” McCuskey said. 

The West Virginia attorney general said the state would seek “statutory and punitive damages,” changes to Apple’s child abuse imagery detection practices and other remedies to make the company’s product designs “safer going forward.”

In the new lawsuit, the state cites a handful of known complaints about Apple’s mostly hands-off approach to its image hosting service. The biggest concern: Apple finds far fewer instances of online child exploitation than its peer companies do because it isn’t looking for them. 

In a statement provided to Fast Company, Apple pointed out an iOS feature that “automatically intervenes” when nudity is detected on a child’s device. “All of our industry-leading parental controls and features… are designed with the safety, security, and privacy of our users at their core,” an Apple spokesperson said.

Apple walks the privacy tightrope

The West Virginia lawsuit isn’t the first of its kind that Apple has faced in recent years, though it is the first coming from a state. In late 2024, a group of thousands of sexual abuse survivors sued the company for more than $1 billion in damages after Apple walked away from a plan to more thoroughly scan the images it hosts for sexual abuse material. In that case, the plaintiffs’ legal team cited 80 instances in which law enforcement discovered child sexual abuse imagery on iCloud and other Apple products. 

Most tech companies rely on a tool developed by Microsoft more than a decade ago to automatically scan images they host and cross-reference those images against digital signatures in a database of known child abuse imagery. That tool, known as PhotoDNA, flags those images and acts as the first step in a reporting chain that leads to law enforcement. 
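For readers curious about what that match-and-flag flow looks like in practice, here is a minimal, purely illustrative sketch in Python. It assumes a hypothetical preloaded set of known signatures (KNOWN_SIGNATURES) and uses an ordinary SHA-256 digest as a stand-in; PhotoDNA itself uses a proprietary perceptual hash designed to survive resizing and re-encoding, which this sketch does not attempt to reproduce.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a database of known-CSAM signatures.
# Real systems like PhotoDNA use robust perceptual hashes; a plain
# SHA-256 digest is used here only to illustrate the matching flow.
KNOWN_SIGNATURES: set[str] = set()  # would be loaded from hash-list data


def signature(image_path: Path) -> str:
    """Compute a content signature for an uploaded image (simplified)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def scan_upload(image_path: Path) -> bool:
    """Return True if the image matches a known signature and should be
    flagged for human review and reporting up the chain."""
    return signature(image_path) in KNOWN_SIGNATURES


if __name__ == "__main__":
    for upload in Path("uploads").glob("*.jpg"):
        if scan_upload(upload):
            print(f"flagged for review/report: {upload}")
```

In a real deployment, a flagged match would not be the end of the process but the first step in the reporting chain the article describes next.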

In the U.S., internet platforms are required by law to report any instances of suspected child sexual abuse material (CSAM) to the National Center for Missing & Exploited Children, the organization that spearheads child abuse prevention online in the country. NCMEC collects tips from online platforms through a centralized CSAM reporting system known as the CyberTipline and forwards those concerns, many collected via PhotoDNA, to relevant authorities.

In 2023, NCMEC received only 267 reports of suspected CSAM from Apple. During the same time frame, the organization received 1.47 million reports from Google, 58,957 reports from Imgur and 11.4 million reports from Meta-owned Instagram.

Apple appears to know the extent of the problem. “We are the greatest platform for distributing child porn,” Apple executive Eric Friedman said in an infamous 2020 text message that surfaced in discovery during the lengthy court battle between Apple and Fortnite maker Epic Games. Friedman made the statement in a conversation about whether the company’s policies tilted too heavily toward user privacy at the expense of safety. 

Apple is known for robust privacy practices that make its products notably resistant to hackers. Over the years, those same encryption systems have frustrated law enforcement agencies such as the FBI, which have sought data locked away on iPhones in the course of their investigations.

“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” an Apple spokesperson said. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”
