


The iCloud neuralMatch Tool 2024: How Apple’s Child Safety Update Could Endanger Us All

iCloud neuralMatch is a tool that allows Apple to scan photos stored on iCloud and iOS devices. Although its stated purpose is to combat child pornography, it raises serious privacy concerns for Apple users. Keep reading as we delve into the details.


Written by Fergus O'Sullivan (Writer, Former Chief Editor)

Reviewed by Aleksander Hougen (Co-Chief Editor)

Last Updated: Feb. 20, 2024


Following widespread backlash to its plan to use the scanning technology, Apple announced on Sept. 3, 2021, that it would delay implementation to gather more input.

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

– Apple, Sept. 3, 2021

Apple’s been in some hot water recently: earlier this month, the story broke that the Cupertino-based company is set to roll out an update later this year that will scan any and all images uploaded to iCloud Photos by U.S. users and compare them to a database of existing child porn. If there’s a match, the authorities will be alerted.

At first glance, this software — dubbed “neuralMatch” — seems like a pretty solid way to combat child sex abuse, one of the worst crimes out there and reviled across cultures and ages. However, it could open the door to a lot of other abuses, by both businesses and governments. 

Key Takeaways:

  • With the aim of combatting child pornography, Apple announced in August 2021 that a future security update will allow Apple to scan all images uploaded to iCloud and flag any that match known images in the NCMEC’s Child Sexual Abuse Material (CSAM) database.
  • Apple plans to roll out its neuralMatch software later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
  • The other parts of the update include adding parental controls for Apple Messages and blocking search results for child pornography.

Let’s take a detailed look at why Apple’s planned update may make children safer, but the rest of us a lot more unsafe.

Update (09/07/2021): Apple announced on Sept. 3, 2021, that it would delay the implementation of the scanning tools following widespread backlash and criticism.

Apple, Child Safety and neuralMatch iCloud Photo Scanning

Apple will start scanning iCloud images to flag child pornography in the iOS 15, iPadOS 15, watchOS 8 and macOS Monterey updates.

The story of Apple’s planned update is one of those rare instances where lines were drawn in the sand from the very start. When it first broke in the Financial Times (our apologies for the paywall), security researchers and privacy advocates all scrambled to condemn it, while governments and child-safety advocates praised it to the heavens.

For example, Matthew Green, an associate professor of computer science at Johns Hopkins University and possibly the first to break the news, said on Twitter:

“This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”

On the other hand, in a follow-up article, the Financial Times quoted an Indian government official who “welcomed” the software, as it set “a benchmark for other technology companies.” UK minister Sajid Javid remarked on Twitter that he was “delighted to see Apple taking meaningful action to tackle child sexual abuse.”

For a day or two, though, all we heard from Apple was crickets, until the company came out with a detailed description of the planned update. There are three basic components: software that will scan any images uploaded to iCloud Photos for child porn, parental controls that will censor any explicit material sent to or from kids’ devices, and new restrictions on Apple Search.

Apple neuralMatch: Scanning iCloud Images

The biggest news is, of course, neuralMatch, though Apple doesn’t call it that in its materials; it was either a temporary name journalists picked up on, or Apple scotched it after all the bad publicity.

Whatever the name, once the update is active, Apple will scan the images on your iPhone or iPad as they are uploaded to iCloud Photos and check them against a database of known child porn. Apple refers to this material as CSAM, short for Child Sexual Abuse Material; the database of known CSAM is maintained by the National Center for Missing and Exploited Children (NCMEC), a U.S. organization set up by Congress to monitor child exploitation and missing children cases.

If there’s a match, Apple does a manual check of the image. If it is indeed child porn or something related, Apple alerts the NCMEC and the associated user account is disabled. Should the user feel Apple messed up somehow, they can appeal to get their account reinstated.

What happens next is unclear. For all the detail Apple provides on how it will hash images, create safety vouchers and so on, it’s very close-mouthed about what, exactly, will happen to users after Apple sends its report to the NCMEC. We can only assume law enforcement will somehow get involved.
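To make the process above a bit more concrete, here is a minimal sketch of what a hash-and-threshold matching pipeline of this kind could look like. It is emphatically not Apple’s NeuralHash or safety-voucher code: the hash function, the placeholder hash values and the threshold of 30 are assumptions made for the example.

```python
# Illustrative sketch only; not Apple's NeuralHash or safety-voucher code.
# The hash function, the hash values and the threshold are placeholders.

import hashlib
from dataclasses import dataclass

# Stand-in for the NCMEC-supplied set of hashes of known abuse imagery.
KNOWN_CSAM_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}

# An account is only flagged once enough matches accumulate; the exact
# threshold Apple uses is an assumption here.
MATCH_THRESHOLD = 30

@dataclass
class ScanResult:
    match_count: int
    flagged_for_review: bool

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system needs a hash that
    survives resizing and re-encoding; SHA-256 is only a placeholder."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload_batch(images: list) -> ScanResult:
    """Hash each image headed for cloud upload, count matches against the
    known-CSAM set, and flag the account only past the threshold."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)
    return ScanResult(match_count=matches,
                      flagged_for_review=matches >= MATCH_THRESHOLD)
```

In Apple’s own technical summary, the comparison happens on the device against a blinded copy of the hash database, and each result is wrapped in an encrypted “safety voucher” that Apple can only open once the match threshold is crossed; the plain-text comparison above is just meant to show the overall shape of the logic.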

The other parts of the update have received less attention. The simplest one is that Apple Search and Siri will be tweaked so that searches for child porn will show a warning that doing so is “harmful.” What exactly this will look like is unclear, though perhaps the biggest takeaway from this is that there are, apparently, people who still use Apple Search.

As for the parental controls for Messages (Apple’s texting app), these let parents protect their kids online by enabling a filter on the Messages app on their child’s device. Whenever a message containing an image that could be considered sexually explicit is sent or received (meaning it works both ways), the image is censored until a parent or guardian has reviewed it.
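To give a rough idea of how such a filter would behave, here is a minimal sketch of the decision flow, assuming an on-device classifier and a parental-approval flag. The function names, the classifier stub and the return values are invented for the example; Apple has not published the Messages implementation.

```python
# Illustrative sketch of an on-device explicit-image filter for a
# messaging app; the classifier stub, account flags and return values
# are assumptions, not Apple's actual Messages code.

def looks_explicit(image_bytes: bytes) -> bool:
    """Placeholder for an on-device ML classifier that scores an image
    for sexually explicit content. A real filter would run a local model;
    this stub always answers no."""
    return False

def handle_image(image_bytes: bytes, is_child_account: bool,
                 guardian_approved: bool) -> str:
    """Decide whether to display or censor an image on a child's device.
    The same check runs for images being sent and received."""
    if not is_child_account:
        return "show"  # the filter only applies to child accounts
    if looks_explicit(image_bytes) and not guardian_approved:
        return "censor_until_reviewed"
    return "show"
```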

Apple Security: Staring Down a Slippery Slope

All the above is pretty impressive from a technical standpoint, though we have no idea how well it will work. That said, it’s also terrifying, and it’s no wonder so many privacy advocates are talking about slippery slopes, or that authoritarian-leaning voices, like the Indian official and the UK minister quoted above, have applauded Apple’s moves.

Nobody at Cloudwards is going to argue that child porn isn’t one of the most heinous things around: it’s nasty stuff and the people who make and distribute it belong behind bars. However, Apple’s new surveillance is a two-edged sword if we’ve ever seen one.

Other cloud storage services, like pCloud, have similar filters in place already (as we discussed in this interview with pCloud’s CEO), but they only scan files being uploaded. Apple will have the ability to scan all images on all iPhones. Also, its infrastructure is a lot more pervasive than that of pCloud, as millions of users are already locked in.

As such, this is pretty powerful stuff. If anybody with access to the system can scan the contents of somebody’s Apple device, then anybody with anything to hide could be in danger of being found out. Sure, this means child pornographers and other criminals, but also human rights activists in totalitarian states like Iran or China (read all about Chinese censorship).

Though the company has promised never to let governments use the technology, it has caved to government demands before. To name just one example, it has negotiated all kinds of deals with the Chinese government in exchange for access to that immensely lucrative market.

Final Thoughts: Apple’s iCloud Photo Scanning

Even if you take Apple at its word that it won’t let dodgy regimes use this particular technology for their own benefit, there’s still another issue: privacy. Even though Apple is building all kinds of safeguards into the system, it’s still a breach of its users’ files. Sure, it’s for a good cause, but so are a lot of things.

When you choose the lesser of two evils, the fact is that your choice is still, you know, evil. If there were a magic button to push to remove all child porn from the internet forever, we’d be the first to push it. However, what Apple is proposing is too big an assault on people’s right to keep their business…well, their business, no matter how noble the reason for it may be.
