Apple software chief admits child protection measures have been ‘widely misunderstood’


Apple’s head of software, Craig Federighi, has told The Wall Street Journal’s Joanna Stern that the company’s new Child Safety measures have been “widely misunderstood.”

In an interview, Federighi said that Apple wished the announcement had come out a little more clearly, following a wave of controversy and adverse reaction to the measures. Federighi told Joanna Stern that “in hindsight”, announcing its new CSAM detection system and a new Communication Safety feature for detecting sexually explicit photos at the same time was “a recipe for this kind of confusion.”

Federighi says that “it’s really clear a lot of messages got jumbled pretty badly” in the wake of the announcement.

On the idea that Apple was scanning people’s phones for images, Federighi said “this is not what is happening.” He said, “to be clear we’re not actually looking for child pornography on iPhones… what we’re doing is finding illegal images of child pornography stored in iCloud”. Noting how other cloud providers scan photos in the cloud to detect such images, Federighi said that Apple wanted to be able to detect this without looking at people’s photos, doing it in a way that is much more private than anything that has been done before.


Federighi described “a multi-part algorithm” that performs a degree of analysis on-device so that a degree of analysis can be done in the cloud, relating to detecting child pornography. Federighi also stated that the threshold of images is “something on the order of 30 known child pornographic images,” and that only when this threshold is crossed does Apple learn anything about your account, and then only about those images, not any other images. He also reiterated that Apple isn’t looking for photos of your child in the bath, or for pornography of any other sort.
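The threshold behavior Federighi describes can be sketched in a few lines. This is a hypothetical illustration only: the hash values, function names, and simple set lookup below are stand-ins, not Apple’s actual system, which uses perceptual hashing (NeuralHash) and cryptographic threshold secret sharing so that nothing is learnable below the threshold.

```python
# Hypothetical sketch of threshold-based matching. Not Apple's
# implementation; hashes and names here are illustrative stand-ins.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the known-image hash database
THRESHOLD = 30  # per Federighi: "on the order of 30" known images

def count_matches(photo_hashes):
    """Count uploaded photos whose hash matches the known database."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def account_flagged(photo_hashes):
    """An account surfaces for review only once the threshold is crossed;
    below it, nothing about the account or its images is learned."""
    return count_matches(photo_hashes) >= THRESHOLD
```

In the real design the comparison itself is split between the device and the server, so neither side alone can determine a match below the threshold; the sketch above only captures the counting logic.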

Pressed about the on-device nature of the feature, Federighi said it was a “profound misunderstanding”, and that CSAM scanning was only being applied as part of the process of storing something in the cloud, not as processing running over images stored on your phone.

On why now, Federighi said that Apple had finally “figured it out” and had wanted to deploy a solution to the…

Source…