Apple child abuse material scanning in iOS 15 draws fire


On Friday, Apple revealed plans to tackle the issue of child abuse on its operating systems within the United States via updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.




The most contentious component of Cupertino’s plans is its child sexual abuse material (CSAM) detection system. Before an image is stored in iCloud Photos, the device will match it against a list of known CSAM image hashes provided by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.





“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result,” Apple said.
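In rough terms, the check sits in the device’s upload path rather than on Apple’s servers. The Swift sketch below illustrates only that placement and is hypothetical throughout: it substitutes an ordinary SHA-256 digest and a plain set lookup for Apple’s perceptual NeuralHash and the private set intersection protocol, and in the real design the device performs the comparison without ever learning the outcome.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the pre-upload check. Apple's system derives a
// perceptual "NeuralHash" and runs private set intersection, so the device
// never learns whether a hash matched; a SHA-256 digest and a set lookup
// stand in for both here, purely to show where the comparison happens.
let knownCSAMHashes: Set<String> = []  // would be populated from the NCMEC-supplied database

func uploadHash(for imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // In the real protocol this boolean stays encrypted inside the safety
    // voucher described below; the device itself cannot branch on it.
    knownCSAMHashes.contains(uploadHash(for: imageData))
}
```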


“The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
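A voucher is, in effect, an encrypted envelope that travels with each upload. A minimal sketch of what such a record might carry, with illustrative field names that are not taken from Apple’s documentation:

```swift
import Foundation

// Hypothetical shape of a safety voucher. Individual vouchers are designed
// to reveal nothing on their own; the match result and image metadata only
// become readable server-side once an account crosses the match threshold.
struct SafetyVoucher: Codable {
    let encryptedMatchResult: Data    // opaque below the threshold
    let encryptedImageMetadata: Data  // e.g. a low-resolution visual derivative
    let accountIdentifier: String
}

// The voucher is uploaded to iCloud Photos alongside the image itself.
func upload(_ photo: Data, with voucher: SafetyVoucher) {
    // hand off to the iCloud Photos upload path
}
```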

Once an undisclosed threshold of matches is reached, Apple will manually review the vouchers and their metadata. If the company determines the content is CSAM, the account will be disabled and a report sent to NCMEC. Cupertino said users will be able to appeal to have an account re-enabled.

Apple is claiming its threshold will ensure “less than a one in one trillion chance per year of incorrectly flagging a given account”.
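On the server side, the threshold works as a gate on human review. The sketch below renders that gate as a plain counter, which is a deliberate simplification: in Apple’s design, threshold secret sharing makes vouchers cryptographically unreadable below the threshold, not merely unreviewed, and the threshold value used here is illustrative since Apple has not disclosed it.

```swift
// Hypothetical server-side gate; all names and the threshold value are
// illustrative. In Apple's design, vouchers below the threshold cannot be
// decrypted at all (threshold secret sharing), rather than simply skipped.
let reviewThreshold = 30  // placeholder; Apple has not published the real value

enum ReviewOutcome { case notCSAM, confirmedCSAM }

func manualReview(voucherCount: Int) -> ReviewOutcome { .notCSAM }  // human reviewers (stub)
func disableAccount(_ account: String) {}                           // user may appeal re-enablement
func reportToNCMEC(_ account: String) {}

func processVouchers(for account: String, voucherCount: Int) {
    guard voucherCount >= reviewThreshold else { return }
    if manualReview(voucherCount: voucherCount) == .confirmedCSAM {
        disableAccount(account)
        reportToNCMEC(account)
    }
}
```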

The other pair of features Apple announced on Friday was having Siri and Search provide warnings when a user searches for CSAM-related content, and using machine learning to warn children when they are about to view sexually explicit photos in Messages.

“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it,” Apple said.
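Apple’s description reads as a short on-device decision sequence. One plausible shape for it, with a stubbed classifier and hypothetical names throughout:

```swift
import Foundation

// Hypothetical sketch of the Messages flow described above; the explicit-
// content classifier runs on-device in Apple's design, and is a stub here.
func looksSexuallyExplicit(_ photo: Data) -> Bool { false }  // on-device ML classifier (stub)
func display(_ photo: Data) {}
func showBlurredWithWarning(_ photo: Data) {}  // blur, helpful resources, reassurance
func childChoosesToView() -> Bool { false }
func notifyParents() {}

func handleIncoming(photo: Data, recipientIsChild: Bool, parentalAlertsEnabled: Bool) {
    guard recipientIsChild, looksSexuallyExplicit(photo) else {
        display(photo)
        return
    }
    showBlurredWithWarning(photo)
    if childChoosesToView() {
        if parentalAlertsEnabled { notifyParents() }  // parents get a message if the child views it
        display(photo)
    }
}
```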

“Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.”
