Apple is planning to roll out software that will scan photos on US iPhones for images of child sexual abuse, the Financial Times reported on Thursday.
Apple could announce more about the software in the coming week, according to the report, which cited security researchers familiar with Apple’s plans.
The software, reportedly called neuralMatch, is designed to scan images stored on iPhones and uploaded to iCloud storage. According to the Financial Times, if the software detects child sexual abuse imagery in a photo, it will pass the material on to human reviewers, who will alert law enforcement if they think the images are illegal.
However, security experts warned that the same scanning capability could expand beyond child sexual abuse images.
“Whether they turn out to be right or wrong on that point hardly matters. This will break the dam – governments will demand it from everyone,” Matthew Green, a cryptographer at Johns Hopkins University, said on Twitter.
An Apple spokesperson did not immediately respond to Insider’s request for comment, and the company declined to comment to the Financial Times.
Apple makes privacy a selling point, at times frustrating law enforcement
This new software, if implemented, would likely please law enforcement and government agencies but risks backlash from privacy activists. Apple has made privacy features a cornerstone of its marketing in recent years, advertising that “what happens on your iPhone stays on your iPhone.”
But there are limits to this promise, and tradeoffs. Apple already monitors images sent from Apple devices for child abuse imagery, using a technique called “hashing,” and alerts law enforcement when both the algorithm and an Apple employee flag suspected material. It also cooperates with law enforcement on lawful requests for information.
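Neither Apple nor the Financial Times has published technical details, but the general shape of hash-based matching is well understood: the device computes a fingerprint of each image and compares it against a database of fingerprints of known abusive material supplied by child-safety organizations. The sketch below is a minimal illustration of that idea in Python; the function names, the example hash set, and the use of a plain SHA-256 digest (rather than the perceptual, neural-network-based hashing neuralMatch reportedly uses) are assumptions for demonstration only, not Apple's actual system.

```python
# Illustrative sketch only: shows the general idea behind hash-based image
# matching, NOT Apple's implementation. The known-hash set and file-digest
# approach are placeholders for demonstration.

import hashlib
from pathlib import Path

# Hypothetical fingerprints of known abusive images; in real systems these
# come from a clearinghouse database, not a hard-coded set.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_path: Path) -> str:
    """Return a SHA-256 digest of the raw file bytes (a stand-in for a
    perceptual hash, which would also match resized or re-encoded copies)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_for_review(image_path: Path) -> bool:
    """True if the image's fingerprint matches a known hash; in a deployed
    system a match would be escalated to human reviewers rather than acted
    on automatically."""
    return fingerprint(image_path) in KNOWN_HASHES
```

A cryptographic digest like SHA-256 only matches byte-for-byte identical files; production systems rely on perceptual hashes so that cropped, resized, or re-encoded copies of a known image still match, which is reportedly the approach neuralMatch takes.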
“Our legal team reviews requests to ensure that the requests have a valid legal basis,” Apple writes on its…