Apple to Scan U.S. Phones for Images of Child Sexual Abuse

The company says, "Our goal is to create technology that empowers people and enriches their lives — while helping them stay safe."


Apple announced new technology that will scan U.S. iPhones for images of child sexual abuse.

The announcement has drawn support from several child welfare groups and criticism from security and privacy advocates.

The tool, called neuralMatch, “will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified,” explains AP News. “The detection system will only flag images that are already in the center’s database of known child pornography.”
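Apple has not published neuralMatch's internals, but the mechanism as described amounts to comparing a fingerprint of each photo against a database of known-image fingerprints. The Swift sketch below is a hedged illustration of that generic flow only: the `knownHashes` set and `matchesKnownDatabase` function are hypothetical names, and CryptoKit's SHA-256 stands in for the perceptual hash Apple reportedly uses (a perceptual hash matches visually similar images, whereas SHA-256 matches only byte-identical files).

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for the center's database of known-image
// hashes. Real systems use perceptual hashes; this toy set holds
// hex-encoded SHA-256 digests purely for demonstration.
let knownHashes: Set<String> = [
    // SHA-256 of empty data, used here as a placeholder entry
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

// Hash the image and check membership in the known-hash set. In the
// flow the article describes, a match would be queued for human
// review rather than acted on automatically.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

print(matchesKnownDatabase(Data())) // true: hits the placeholder entry
```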

A top cryptography professor at Johns Hopkins University, Matthew Green, warned that the technology could be used to frame innocent people by sending them images meant to trigger neuralMatch.

According to Newsweek, “news of these updates was first reported in the Financial Times where the paper wrote that the detection feature would ‘continuously scan photos that are stored on a U.S. user’s iPhone’ with harmful material being alerted to law enforcement. This announcement caught some privacy experts by surprise given the route Apple took in 2016 when it refused to unlock the San Bernardino terrorists’ phone upon receiving a request from the FBI.”

John Clark, chief executive of the National Center for Missing & Exploited Children, hailed the new software as “lifesaving.”

In a statement, he said, “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material … The reality is that privacy and child protection can co-exist.”

Apple’s chief privacy officer, Erik Neuenschwander, said on a media call Thursday that people whose images are flagged to law enforcement would not be notified directly; instead, their accounts would be disabled.

“The fact that your account is disabled is a noticeable act to you,” he said.

Users who feel their account was improperly suspended can appeal to have it reinstated, says Reuters.

“Tech companies including Microsoft, Google, Facebook, and others have for years been sharing ‘hash lists’ of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images,” reports Fox8.

On the company’s website, Apple also says it will add new child safety features to the Messaging app so parents can have a “more informed role in helping their children navigate communication online.”

“The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” the page says.
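Apple's page does not describe the model, but the flow it implies can be sketched: the classifier runs on the device, and only the local UI reacts to its output. In the Swift sketch below, `classifySensitivity` and `handleIncomingImage` are hypothetical names, and the placeholder classifier stands in for whatever on-device model Messages would actually use; nothing in this flow is transmitted to Apple.

```swift
import Foundation

enum Sensitivity { case safe, sensitive }

// Hypothetical placeholder for the unnamed on-device model. A real
// implementation would run an ML model (e.g. via Core ML) over the
// decoded image; the result never leaves the device.
func classifySensitivity(_ imageData: Data) -> Sensitivity {
    return .safe
}

// Gate the image locally: warn before display if the classifier
// flags it, otherwise show it normally.
func handleIncomingImage(_ imageData: Data,
                         showWarning: () -> Void,
                         display: (Data) -> Void) {
    switch classifySensitivity(imageData) {
    case .sensitive:
        showWarning()      // the child sees a warning before viewing
    case .safe:
        display(imageData) // displayed as usual
    }
}
```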

The communication safety feature will be available to U.S. accounts set up as families in iCloud, arriving in an update later this year with iOS 15, iPadOS 15, and macOS Monterey.
