Should We Celebrate or Condemn Apple’s New Child Protection Measures?


Last week, Apple announced that it would deploy hashing technologies to protect children from sexual abuse and exploitation. In response, child-rights advocates cheered and privacy-rights advocates jeered. Both, however, are putting too much stock in Apple’s announcement, which is neither cause for celebration nor denunciation.



MADRID, SPAIN - MAY 2021: Apple Logo in Apple Store in Puerta del Sol on May 28, 2021 in Madrid, Spain.


© Cristina Arias/Cover/Getty Images

Each year, hundreds of millions of images and videos of child sexual abuse circulate online. The majority of children in these materials are prepubescent, and many of them are infants and toddlers. In addition, every day children are exposed to unsolicited sexual advances and sexual content online. We must do more to protect our children, both online and offline.


For the past two decades, the technology industry as a whole has been lethargic, even negligent, in responding to the threats posed by the global trade of child sexual abuse material (CSAM), live-streaming of child sexual abuse, predatory grooming and sexual extortion. At the same time, the industry has made every effort to make sure its products and services get into—and remain in—the hands of children.

Since 2010, after years of cajoling, many social media platforms, email services and cloud-storage services have deployed perceptual-hashing technology to prevent redistribution of previously identified CSAM content. Previously unseen material and other online risks for children, however, remain a threat. Child-rights activists consider hashing technology a bare-minimum protection that any responsible online provider should deploy, not a gold standard.
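The core idea behind perceptual hashing can be illustrated with a minimal sketch. The snippet below implements "difference hashing" (dHash), one simple member of this family; production systems such as PhotoDNA use far more robust algorithms, and the grids and thresholds here are purely illustrative assumptions. The point the sketch makes is that a perceptual fingerprint survives small edits (like a brightness shift) that would completely change a cryptographic hash, so matching is done by Hamming distance rather than exact equality.

```python
# Illustrative sketch of difference hashing (dHash), a simple perceptual
# hash. Assumes the image has already been downscaled to a 9-column by
# 8-row grid of grayscale values; real systems handle resizing, rotation
# and cropping far more robustly.

def dhash(pixels):
    """Build a 64-bit fingerprint: one bit per horizontally adjacent
    pixel pair, set when the left pixel is brighter than the right."""
    bits = 0
    for row in pixels:                      # 8 rows
        for left, right in zip(row, row[1:]):  # 8 pairs per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A tiny 9x8 "image": a left-to-right brightness gradient.
original = [[col * 28 for col in range(9)] for _ in range(8)]

# A lightly edited copy: global brightness shift. A cryptographic hash
# of the file would change entirely; the perceptual hash does not.
brightened = [[min(255, v + 10) for v in row] for row in original]

# An unrelated image: the gradient reversed.
unrelated = [[(8 - col) * 28 for col in range(9)] for _ in range(8)]

print(hamming(dhash(original), dhash(brightened)))  # 0  -> match
print(hamming(dhash(original), dhash(unrelated)))   # 64 -> no match
```

In practice a match is declared when the Hamming distance falls below a small threshold, which is what lets these systems recognize re-encoded or slightly altered copies of previously identified material.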

To its credit, Apple has used hashing technology in its email services for years. Its most recent announcement simply extends the reach of this technology to any image stored in iCloud. This technology, like previous hashing technologies, is extremely accurate: Apple estimates a one-in-one-trillion chance per year of incorrectly flagging a given account. To further ensure reporting accuracy, any matched image is manually reviewed by Apple and the National Center for Missing and Exploited Children before the…
