Apple software head says plan to scan iPhones for child abuse images is ‘misunderstood’

Apple unveiled its plans to fight child abuse imagery last week. (Patrick Holland/CNET)

Apple plans to scan some photos on iPhones, iPads and Mac computers for images depicting child abuse. The move has upset privacy advocates and security researchers, who worry that the company’s newest technology could be twisted into a tool for surveillance and political censorship. Apple says those concerns are misplaced and based on a misunderstanding of the technology it’s developed.

In an interview published Friday by The Wall Street Journal, Apple’s software head, Craig Federighi, attributed much of the concern to the company’s poorly handled announcement of its plans. Apple won’t be scanning all photos on a phone, for example, only those connected to its iCloud Photo Library syncing system. And it won’t really be scanning the photos either, but rather checking a digital fingerprint, or hash, derived from each image against a database of hashes of known child abuse imagery.
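
As a rough illustration of that kind of on-device hash matching, the sketch below derives a fingerprint from an image’s data and checks it against a set of known fingerprints. It is not Apple’s NeuralHash system or its detection code; the function names are placeholders, and SHA-256 stands in for a real perceptual hash, which is designed to survive resizing and re-encoding.

```swift
// Illustrative sketch only: NOT Apple's NeuralHash or its CSAM-detection code.
// It shows the general shape of hash matching: derive a fingerprint from image
// bytes and compare it against a database of known fingerprints.
import Foundation
import CryptoKit

/// Derive a hex fingerprint from raw image data. A real perceptual hash is
/// built to tolerate resizing and re-encoding; SHA-256 is only a stand-in.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Return true if the image's fingerprint appears in a set of known hashes
/// (standing in for a database of hashes of previously identified imagery).
func matchesKnownHashes(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

// Hypothetical usage: only photos headed for iCloud upload would be checked.
let photoData = Data([0x01, 0x02, 0x03])   // placeholder image bytes
let knownHashes: Set<String> = []          // placeholder database
print(matchesKnownHashes(photoData, knownHashes: knownHashes))  // prints "false"
```

Apple’s published description also layers additional cryptographic steps on top of the comparison, including a match threshold and private set intersection, before anything is surfaced for review; this sketch omits all of that.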

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Federighi said in his interview. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

Read more: Apple, iPhones, photos and child safety: What’s happening and should you be concerned?

For years, Apple has sold itself as a bastion of privacy and security. The company says that because it makes most of its money selling us devices, and not by selling advertisements, it’s able to erect privacy protections that competitors like Google won’t. Apple’s even made a point of indirectly calling out competitors in its…
