Apple’s Newest Update: the CSAM Detection Technology

  By Pia Lopez

The newest Apple update has raised some interesting questions about where the line should be drawn on content stored in our digital devices. Companies like Apple have been fighting hard to protect children from despicable acts such as child exploitation, hence the vital importance of this brand-new algorithm, which scans content on the device itself rather than cloud-stored photos.

Why should you be aware of this technology as a photographer? In this post, we’ll explain the importance of updating your device to the latest OS version to help Apple with this mission, and shed some light on which content gets scanned.

What’s CSAM?

Before we even start to discuss this new technology: the acronym CSAM stands for Child Sexual Abuse Material. Yes, smartphones have made these horrid practices far easier to carry out, hence the reason why tech giants have explored solutions to help in this fight. The concept behind the technology is for Apple to securely scan your smartphone’s photo storage for content matching a database of known CSAM provided by child safety organizations. Said explicit content will then be reported to the NCMEC (National Center for Missing and Exploited Children), the organization that works in collaboration with US law enforcement to dismantle child sexual abuse networks.

Apple has put much thought into this development, to the point that Search and Siri will also intervene when users attempt CSAM-related queries. All whilst encrypting data to prevent leaks, and without allowing users to sync CSAM content to iCloud.

Image courtesy of Apple

What about those who don’t live in the USA?

It’s a very good question, considering that this extremely detailed search currently seems limited to reporting offending data to US authorities. Whilst Apple hasn’t released statements about rolling out the service worldwide, we can assume international bodies such as Interpol would become involved, given the international reach of this campaign.

We should also keep in mind that the Messages protections aren’t activated by default: they’re part of the family protection settings, managed by parents and intended to shield children from sensitive material.

On Messages and other apps

One of the key elements of this Apple update is that, once enabled, it prevents minors from sending or receiving sensitive images via the Messages app by triggering a series of steps:

  1. The photo is presented blurred (greyed out, in fact) to the receiver.
  2. A pop-up opens, warning the child that it isn’t necessary to open said content and offering tools to help them handle the situation. If the photo is opened anyway, the parents get a message reporting that their child viewed sensitive material on their phone.
  3. If a child intends to send sensitive photos to others, the parents get a notification about said action before the photo is sent. They also get a notification if the photo is sent.

It’s worth noting that the actions mentioned above apply to both sending and receiving sensitive content.
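The steps above can be condensed into a toy sketch. To be clear, the function and event names below are my own invention for illustration, not Apple’s actual API:

```python
def message_safety_flow(direction: str, child_proceeds: bool) -> list[str]:
    """Toy model of the Messages protections described above.

    direction: "receive" or "send".
    child_proceeds: whether the child opens (or sends) the photo
    despite the warning.
    """
    events = []
    if direction == "receive":
        events.append("photo shown blurred")
        events.append("warning pop-up with guidance")
        if child_proceeds:
            events.append("parents notified: photo opened")
    else:  # "send"
        events.append("warning pop-up with guidance")
        # Parents hear about the attempt before anything is sent.
        events.append("parents notified: send attempt")
        if child_proceeds:
            events.append("parents notified: photo sent")
    return events

# Receiving and opening a flagged photo ends with the parents notified.
events = message_safety_flow("receive", child_proceeds=True)
```

Note how, in either direction, the child is warned first and the parents only hear about it when the child goes ahead (or, for sending, as soon as the attempt is made).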

Image courtesy of Apple

WhatsApp and other messaging apps operate under different content-distribution policies than Apple. However, now that this update is live, similar limits on distributing CSAM content via third-party apps will likely follow.

Will this update affect other data?

And here is what photographers have been asking since the release: what about sensitive data that doesn’t involve minors? Or your data in general?

Believe it or not, this update does not give Apple superpowers to go through your content and pull “insider data” from your intimate conversations. The mechanism is targeted to scan only what matches known CSAM content. Apple is extremely clear about the process, as shown below:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
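Apple’s quote above can be loosely sketched in Python. This is a heavy simplification under stated assumptions: `image_hash`, `known_hashes`, and `make_safety_voucher` are hypothetical names, plain SHA-256 stands in for Apple’s perceptual NeuralHash, and the private set intersection, blinding, and threshold secret sharing are omitted entirely:

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for Apple's NeuralHash. A real perceptual hash matches
    near-duplicate images; SHA-256 only matches exact bytes, which is
    enough to illustrate the flow."""
    return hashlib.sha256(image_bytes).hexdigest()


# Database of known CSAM hashes shipped to every device. In the real
# system this set is blinded, so the device cannot read it directly.
known_hashes = {image_hash(b"known-offending-image")}


def make_safety_voucher(image_bytes: bytes) -> dict:
    """Attach a (toy) safety voucher to an upload: the match result
    travels with the image to iCloud Photos. Real vouchers are
    encrypted, and the server can only decode them after a threshold
    number of matches is crossed."""
    matched = image_hash(image_bytes) in known_hashes
    return {"match_result": matched}


# An ordinary photo produces a non-matching voucher and uploads normally.
voucher = make_safety_voucher(b"holiday-photo")
```

In the real protocol even the device never learns `match_result`: that is the point of private set intersection, which keeps the outcome hidden from both sides until server-side decryption past the threshold.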

For this very reason, only known CSAM content gets flagged. No nude photos, no artistic nudes, no credit card data, nor any other kind of sensitive information you might be fearing. iCloud also enforces its encryption system to prevent data leaks from iCloud Photos. This was done with sexting practices in mind, but also the growing number of professionals who use smartphones as a work tool and don’t wish to disclose their content unless the customer allows distribution.


What does this update mean for us as photographers? First, it’s important to acknowledge the hard work behind the fight against child sexual abuse. This update also sets a precedent for other brands to implement similar technology on their devices.

As a bonus, we get clear hints about how data is handled when uploaded to the cloud. Apple does not benefit from spying on its customers’ data, so your content is in a safe place. Next, the enforced encryption also protects our data against hackers or any other kind of unauthorized access to our photo storage. We can rely on the cloud as a safe medium for backups, reducing the need to own physical storage media (and preventing data loss tragedies like this one).

All in all, this cutting-edge technology lays the building blocks for better data usage. If you were concerned about your work as a boudoir photographer, rest assured that your customers’ data won’t be affected, as the scan only matches against hashes of already-known CSAM images. And for parents, it’s a safer medium that helps prevent family crises over sensitive photos disclosed on the internet, though it won’t erase the typical teenage-embarrassing baby photos from existence 🙂


Pia Lopez

Pia Lopez is a self-taught photographer, graphic designer and ArchViz artist. As Content Director of, her work is driven by her two biggest passions: technology and art.
