Apple’s Latest Plans: What You Need To Know About CSAM Detection

Nuha
3 min read · Aug 11, 2021



Apple’s new technology is designed to safeguard children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM).

Goals of the System

Apple’s new CSAM detection will scan all iCloud Photos, both old and new, for images that match known child sexual abuse material. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States (Apple).


The Technological Aspect — How it Works

1) Before storing an image in iCloud Photos, the system performs a matching process against known CSAM hashes. These hashes are provided by NCMEC and other child safety organizations. Using a cryptographic technique called private set intersection, the match is determined without revealing the result. The device then creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image, and this voucher is uploaded to iCloud Photos together with the image (Apple). (A simplified sketch of this whole flow appears after step 3.)

2) Another technology, called threshold secret sharing, ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.

Apple states that “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account” (Apple). In other words, Apple claims that falsely flagging an account should be extraordinarily rare.

3) If the threshold is exceeded, the technology allows Apple to decrypt the safety vouchers and interpret the matched images. Apple employees then manually review the report to confirm a match. If a match is confirmed, the user’s account is disabled and a report is sent to NCMEC.
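
To make the data flow above more concrete, here is a minimal, non-cryptographic sketch in Swift. It is purely illustrative: Apple’s actual system relies on a perceptual NeuralHash, private set intersection, and threshold secret sharing to hide match results cryptographically, while this toy version simply gates access behind a match count. Every name below (imageHash, SafetyVoucher, VoucherStore, the threshold value) is invented for the example.

```swift
import Foundation
import CryptoKit

// Purely illustrative sketch of the matching / safety-voucher / threshold
// flow. The real system uses NeuralHash, private set intersection, and
// threshold secret sharing; none of that cryptography is reproduced here,
// and all names are hypothetical.

/// Stand-in for a perceptual image hash. Apple's NeuralHash comes from a
/// neural network; here we just hash the raw bytes so the example runs.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// A "safety voucher" encoding the match result for one image. In the real
/// design this result is encrypted so the server cannot read it below the
/// threshold; here it is kept in the clear to show the data flow.
struct SafetyVoucher {
    let imageID: UUID
    let matchedKnownHash: Bool
}

/// Server-side accumulator that only reveals matches once an account crosses
/// the reporting threshold, mimicking the effect of threshold secret sharing.
final class VoucherStore {
    private let threshold: Int
    private var vouchers: [SafetyVoucher] = []

    init(threshold: Int) { self.threshold = threshold }

    func upload(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    /// Below the threshold nothing is revealed; at or above it, the matching
    /// vouchers become available for human review.
    func reviewableMatches() -> [SafetyVoucher]? {
        let matches = vouchers.filter { $0.matchedKnownHash }
        return matches.count >= threshold ? matches : nil
    }
}

// Example flow (all values are placeholders):
let knownHashes: Set<String> = []            // hashes provided by NCMEC et al.
let store = VoucherStore(threshold: 30)      // the real threshold is not public

let photo = Data("example image bytes".utf8)
let voucher = SafetyVoucher(imageID: UUID(),
                            matchedKnownHash: knownHashes.contains(imageHash(photo)))
store.upload(voucher)

if let matches = store.reviewableMatches() {
    print("Threshold crossed: \(matches.count) vouchers go to human review")
} else {
    print("Below threshold: individual match results stay hidden")
}
```

The point of the threshold design, per the quote above, is that a single match (or false match) reveals nothing on its own; only a pattern of many matches unlocks anything for human review.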

Role in Messages

The Messages app will include new safety features that warn children and their parents when sexually explicit content is being received or sent. When explicit content is received, the photo will be blurred and the child will be warned and presented with helpful resources; parents may also be notified if the child decides to view the photo anyway. If a child tries to send explicit content, they will be warned and their parents may be notified. Apple will not be reading your messages: an on-device machine learning mechanism analyzes image attachments for sexually explicit content, so the messages themselves never reach Apple.
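
Since Apple has not published the Messages model or its API, the following is only a hypothetical Swift sketch of the decision logic described above; the classifier and every name in it are invented. The key point it illustrates is that the check runs entirely on-device.

```swift
import Foundation

// Hypothetical sketch of the on-device Messages decision logic. The
// classifier below stands in for Apple's unpublished model; no real API
// names are used, and nothing here leaves the device.

struct ExplicitContentClassifier {
    /// Placeholder: a real implementation would run an on-device ML model.
    func isSexuallyExplicit(_ imageData: Data) -> Bool {
        false // stub so the sketch compiles
    }
}

enum IncomingImageAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

/// Decide how to present a received image for a child account. The image is
/// analyzed locally, so Apple never sees the message or its attachments.
func handleIncomingImage(_ imageData: Data,
                         isChildAccount: Bool,
                         parentalNotificationsEnabled: Bool,
                         classifier: ExplicitContentClassifier) -> IncomingImageAction {
    guard isChildAccount, classifier.isSexuallyExplicit(imageData) else {
        return .showNormally
    }
    // Blur the photo, warn the child, surface helpful resources, and notify
    // parents only if the child chooses to view it anyway (and only if the
    // parents opted in).
    return .blurAndWarn(notifyParentsIfViewed: parentalNotificationsEnabled)
}

// Example usage with placeholder data:
let action = handleIncomingImage(Data(),
                                 isChildAccount: true,
                                 parentalNotificationsEnabled: true,
                                 classifier: ExplicitContentClassifier())
print(action)
```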

Apple is set to release these features later this year; the changes will arrive in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey (Apple).

Role in Siri and Search

  1. Apple will be updating Siri and Search to provide people with resources regarding online safety.
  2. Apple will also equip Siri with tools for intervention: when individuals search for CSAM-related content, they will be provided with information on why the topic is harmful.

Final Thoughts


I think this technology could be quite beneficial. Not so surprisingly, Apple announced this news well before the technology’s release date, probably to gauge how the public would react.

What are your thoughts on this? Do you think Apple will be breaching the privacy of its 193 million users with this new update? Feel free to share your opinions in the comments below!

All information has been derived from apple.com. If any information in this article is found to be false or misleading, feel free to leave a note in the comments.
