Apple announced its intention to roll out a new update that would allow the company to detect images of child sexual abuse stored in iCloud Photos. The announcement came paired with two new features similarly designed to protect against child abuse.
Along with the iCloud feature, the company plans to launch a new tool within the Messages app that would warn children and their parents when sexually explicit photos are sent or received. Additionally, Apple announced its intention to expand guidance in Siri and Search to protect children from “unsafe situations.”
News of these updates was first reported by the Financial Times, which wrote that the detection feature would “continuously scan photos that are stored on a U.S. user’s iPhone” and alert law enforcement to harmful material. The announcement caught some privacy experts by surprise, given the route Apple took in 2016 when it refused an FBI request to unlock the San Bernardino terrorists’ phone.
Matthew Green, a cryptography professor at Johns Hopkins University, reacted on Twitter saying, “Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.” He followed this by saying, “Imagine what it could do in the hands of an authoritarian government?”
Apple said in a statement released after the Financial Times report that its detection system is designed with “user privacy in mind.” Instead of scanning images in the cloud, it said the “system performs on-device matching using a database” of known child abuse images compiled by the National Center for Missing and Exploited Children (NCMEC). Apple wrote that it transforms that database material into unreadable “hashes” that are stored on the user’s device.
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known [child sexual abuse] hashes,” the company wrote. “This matching process is powered by a cryptographic technology called…
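The on-device matching flow Apple describes can be sketched in simplified form. To be clear, this is a hypothetical illustration, not Apple's implementation: Apple's system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, while the sketch below substitutes a plain SHA-256 digest and ordinary set membership to show only the basic idea of checking an image against a database of known hashes before upload.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image's contents.

    Apple's real system uses NeuralHash, which tolerates small visual
    changes; SHA-256 is used here purely as a simplified placeholder.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Database of known hashes (in Apple's design, derived from NCMEC
# material and shipped to the device in unreadable form). These
# contents are illustrative placeholders only.
known_hashes = {image_hash(b"example-flagged-image")}

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check performed before an image is uploaded."""
    return image_hash(image_bytes) in known_hashes

print(matches_known_database(b"vacation-photo"))        # False
print(matches_known_database(b"example-flagged-image"))  # True
```

One important difference from the real system: a cryptographic hash like SHA-256 changes completely if a single pixel changes, whereas a perceptual hash is designed so that visually similar images produce matching hashes, which is what makes database matching practical for photos.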