Apple Software Could Detect Child Sexual Abuse Material On iPhones

In this photo illustration the stock trading graph of Apple...

Source: SOPA Images / Getty

Apple is preparing to release a groundbreaking new piece of software that just might save a child’s life.

On August 5, the iPhone maker announced plans to roll out a new tool called “NeuralHash,” which can help determine whether an individual is storing child sexual abuse material (CSAM) in iCloud. The tech giant issued a statement describing how the new software will work on future iPhones.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
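The statement describes matching an image against a database of known CSAM hashes. Apple's actual system uses a proprietary neural perceptual hash and private set intersection, neither of which is public; the toy sketch below only illustrates the basic idea of hash-based matching, using an ordinary cryptographic hash and a plain set lookup as stand-ins.

```python
# Illustrative sketch only. NeuralHash is a proprietary perceptual hash,
# and the real comparison happens through private set intersection so the
# device never learns the match result; neither is reproduced here.

import hashlib

# Hypothetical database of hashes of known images (placeholder values).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real one would also match
    resized, cropped, or re-encoded copies of the same image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Device-side check: does this image's hash appear in the known set?"""
    return image_hash(image_bytes) in KNOWN_HASHES
```

In the real design, the result of this check is sealed inside an encrypted "safety voucher" rather than returned to the device, which is what the private-set-intersection step in Apple's statement accomplishes.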

According to Complex, “Once the automated system finds a match, a human will review the image in question and assess whether it is illegal.”

If that human review confirms the material is CSAM, the user’s account will be automatically deactivated and reported to the National Center for Missing and Exploited Children (NCMEC).

The news was met with criticism from some iPhone users who are concerned their privacy may be violated in the process. Matthew Green, a cryptography professor at Johns Hopkins University, issued a series of tweets stating that the new technology could be harmful if placed in the wrong hands.

“Initially, I understand this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems,” he wrote.

“This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?” he added.

Apple reassured its users that the new system will use multiple layers of encryption, designed so that several independent steps must occur before anything is flagged. The company says it hopes its new CSAM detection will help provide valuable information to law enforcement on collections of CSAM in iCloud Photos.
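The "multiple steps before anything becomes flagged" design can be sketched as simple threshold logic: an account is surfaced for human review only after the number of matching vouchers crosses a minimum count. The threshold value and the cryptographic threshold-sharing scheme Apple uses are assumptions here, not taken from the article.

```python
# Hedged sketch of threshold-based flagging. The real system uses threshold
# secret sharing so vouchers cannot even be decrypted below the threshold;
# this toy version only models the counting behavior.

MATCH_THRESHOLD = 30  # hypothetical value, for illustration only

def account_flagged(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """A single match reveals nothing; only an accumulation of matches
    above the threshold triggers review."""
    return match_count >= threshold
```

The design choice matters for privacy: one false-positive hash collision on an innocent photo cannot, by itself, expose an account to review.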
