Child protection advocates are praising Apple's plan to roll out new anti-child-pornography features later this year—but privacy advocates worry about what else it might lead to. The company says the software, called neuralMatch, will scan images on a user's iPhone before they are uploaded to the iCloud storage service to see if they match a database of known child sexual abuse images, the Wall Street Journal reports. If there is a match, the phone will be disabled and the user will be reported to the National Center for Missing and Exploited Children, which works with law enforcement agencies.
- Context. The Washington Post says the type of matching being done is something companies like Facebook already do. "But in those systems, photos are scanned only after they are uploaded to servers owned by companies like Facebook." In looking at what's on a user's device, Apple is treading into new "client-side" surveillance territory.
- Apple's reassurances. In a blog post, Apple says its technology ensures there is only "a one in one trillion chance per year of incorrectly flagging a given account," and all flagged accounts will be reviewed by a human.
- A criticism. The Electronic Frontier Foundation, however, called the move "a shocking about-face for users who have relied on the company’s leadership in privacy and security," ABC reports. Other privacy advocates warned that technology that scans users' phones for banned content could be abused by authoritarian regimes.
- One worrisome scenario. The AP talks with Matthew Green, a Johns Hopkins cryptography researcher, who gives this hypothetical: Someone could manipulate the system to frame a person by sending them "seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement." He said researchers have managed to do this.
- In support. Hany Farid of the University of California at Berkeley has developed technology to spot child pornography online and said any concerns should be outweighed by the need to protect children. Farid noted that other programs designed to protect devices against threats haven't seen the "mission creep" that privacy advocates are warning about.
- One other use. Apple also plans to add tools to its encrypted iMessage service to stop children from receiving—or sending—sexually explicit images, reports the Post. The images will be blurred, and minors will be warned that their parents could be notified if they choose to view them.