Big Tech /

Apple to Scan U.S. Phones for Images of Child Sexual Abuse

The company says, "Our goal is to create technology that empowers people and enriches their lives — while helping them stay safe"

Apple announced new technology that will scan U.S. iPhones for images of child sexual abuse.

The announcement has drawn support from several child welfare groups and criticism from security and privacy advocates.

The tool, called neuralMatch, “will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified,” explains AP News. “The detection system will only flag images that are already in the center’s database of known child pornography.”
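The matching flow AP News describes — hash each image, compare it against a database of known material, and escalate matches for human review — can be sketched as follows. This is a minimal illustration only: it uses a cryptographic hash as a stand-in, whereas Apple's actual system uses a proprietary perceptual hash (NeuralHash) that tolerates minor image edits, and the function names and sample data here are hypothetical.

```python
import hashlib

def hash_image(data: bytes) -> str:
    """Cryptographic hash used as a simplified stand-in for a
    perceptual hash; a real perceptual hash would match near-duplicates,
    not just byte-identical files."""
    return hashlib.sha256(data).hexdigest()

def should_flag_for_review(image_bytes: bytes, known_hashes: set) -> bool:
    """Flag only images whose hash appears in the database of known
    material; per the article, a match triggers human review rather
    than any automatic action."""
    return hash_image(image_bytes) in known_hashes

# Example: only an image already in the database is flagged.
database = {hash_image(b"known-image-bytes")}
print(should_flag_for_review(b"known-image-bytes", database))  # True
print(should_flag_for_review(b"new-photo-bytes", database))    # False
```

The key property the article attributes to the system follows from this design: images absent from the database can never be flagged, because flagging requires an exact entry in the known-hash set.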

A top cryptography professor at Johns Hopkins University, Matthew Green, warned that the technology could be used to frame innocent people by sending them images meant to trigger neuralMatch.

According to Newsweek, “news of these updates was first reported in the Financial Times where the paper wrote that the detection feature would ‘continuously scan photos that are stored on a U.S. user’s iPhone’ with harmful material being alerted to law enforcement. This announcement caught some privacy experts by surprise given the route Apple took in 2016 when it refused to unlock the San Bernardino terrorists’ phone upon receiving a request from the FBI.”

John Clark, chief executive of the National Center for Missing & Exploited Children, hailed the new software as “lifesaving.”

In a statement, he said, “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material … The reality is that privacy and child protection can co-exist.”

Apple’s chief privacy officer, Erik Neuenschwander, said on a media call Thursday that people whose images are flagged to law enforcement would not be notified directly; instead, their accounts would be disabled.

“The fact that your account is disabled is a noticeable act to you,” he said.

Users who feel their account was improperly suspended can appeal to have it reinstated, says Reuters.

“Tech companies including Microsoft, Google, Facebook, and others have for years been sharing ‘hash lists’ of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images,” reports Fox8.

On the company’s website, Apple also says it will add new child safety features to the Messages app so parents can have a “more informed role in helping their children navigate communication online.”

“The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” the page says.

U.S. accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey will have access to the communication safety tool. It will arrive in an update later this year.

*For corrections please email [email protected]*

18 responses to “Apple to Scan U.S. Phones for Images of Child Sexual Abuse”

  1. ChetF says:

As much as I would like to see pedos hauled off to jail, this invasion of privacy just seems like a gateway to a complete nanny state by companies that already have too much power over the masses.

  2. magg0t82 says:

    An excuse to get access to your data

  3. Devilsgun says:

    So is Apple going to compare the creepo content of their iTard phone users against the contents of Hunter Biden’s laptop and/or the Epstein Files that all the megachomo elites fap to? Just curious

  4. CraggFirearms says:

    Good point…who’s watching the watchers. That’s always the problem with progressive policies, they always rely on some “authority” that makes subjective judgements and has access to far more information than should be socially acceptable.

  5. Viewtifuljoe says:

    Like Epstein or actual terrorists, the government knows who these people are (with conventional means) and will do nothing unless you’re a whistleblower then the state’s power will come after you.

  6. PolishPierogi says:

Mistakes, glitches, mishaps, and misunderstandings. It’s all so tiresome.

  7. UppityG says:

    Of course. They have that script well memorized.

  8. UppityG says:

    Could not agree more. Get rid of the Demons. So much suffering would naturally evaporate.

  9. TCappo3 says:

    Good idea

  10. TCappo3 says:

    Check and mate!

  11. Wolv256 says:

    I’m sure their arrests of conservatives will just be mistakes.

  12. Wolv256 says:

    Well, joke’s on them, they already banned me, I can’t use any of their apps anymore.

  13. Wolv256 says:

Start with the Democrats. Another Democratic senator, Tony Navarette, just arrested for being a pedo. They are all monsters. Stop oppressing our freedom and just stop the Democrats, that’s who’s hurting all the children.

  14. Hsims says:

    They always get you by violating your privacy with something hard to argue. Who gonna argue FOR CP? 2025 they will scan your phone for political beliefs.

  15. ryanjogden says:

    Concerning that big tech companies have access to a child pornography database.

  16. DocLockJ says:

    Pine Phone and freedom phone is looking a lot nicer right about now.

    Might be a good business model to charge people to root their phone.

  17. axegarden says:

    Anyone else find it odd that a child porn image database is being shared around all the big tech companies?

  18. Feddy_Von_Wigglestein says:

    “Users who feel their account was improperly suspended can appeal to have it reinstated, says Reuters.”

    Riiight, unless you have the “wrong” political opinions. What better way to unperson someone than involving illegal images that everybody agrees are objectionable? Cause Big Tech has such a good track record when it comes to anything involving truth and security….