Apple has confirmed it will start scanning photos stored in iCloud for child sexual abuse material, beginning with the next iOS update. If such material is discovered, the user will be reported to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement authorities in the US.
Apple released a statement on the plans following media reports that it would start checking iCloud photos. The change will take effect later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey and is part of a wider effort by Apple to combat child sexual abuse material (CSAM) online.
Apple said it designed the system with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organizations. An account is only flagged once the number of matching images crosses a set threshold, which Apple says is calibrated to ensure less than a one in one trillion chance per year of incorrectly flagging a given account.
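The general shape of such a threshold check can be illustrated with a short sketch. Everything below is a simplification: the ImageHash type, the hash values, and the reviewThreshold are hypothetical stand-ins, and the sketch omits the cryptographic protections Apple describes. It only shows the idea of matching photos against a list of known hashes and flagging an account after repeated matches.

```swift
// Minimal illustration of threshold-based hash matching.
// Names, values, and data formats are assumptions, not Apple's implementation.

/// Stand-in for a perceptual hash of an image. In a real system this would be
/// derived from image content so visually identical images produce the same hash.
typealias ImageHash = String

struct MatchingEngine {
    /// Hashes of known CSAM supplied by child-safety organizations (hypothetical data).
    let knownHashes: Set<ImageHash>
    /// Number of matches required before an account is flagged for review.
    let reviewThreshold: Int

    /// Returns true if the photo's hash appears in the known-hash database.
    func matches(_ hash: ImageHash) -> Bool {
        knownHashes.contains(hash)
    }

    /// Counts matches across an account's photos and reports whether the
    /// threshold has been reached; only then would any review take place.
    func shouldFlagAccount(photoHashes: [ImageHash]) -> Bool {
        let matchCount = photoHashes.filter(matches).count
        return matchCount >= reviewThreshold
    }
}

// Example usage with made-up hash values.
let engine = MatchingEngine(knownHashes: ["a1f3", "9c2e"], reviewThreshold: 3)
let uploads: [ImageHash] = ["77b0", "a1f3", "d412"]
print(engine.shouldFlagAccount(photoHashes: uploads)) // false: one match, below threshold
```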
Apple said it would only gain access to a user's photos if they contain known CSAM, and even then only to the matching images.
Warnings in the Messages app, Siri, and Search
Other planned measures include warnings in the Messages app when children receive or send sexually explicit photos, along with notifications for their parents. When a child receives this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view the photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it.
Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
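As a rough illustration of the receive-side flow described above, the sketch below assumes a hypothetical isSexuallyExplicit function standing in for the on-device classifier; the types and parameter names are invented and are not Apple's API. It only shows the ordering of steps: blur and warn first, show the photo only if the child chooses to view it, and notify parents only in that case when notifications are enabled.

```swift
import Foundation

// Illustrative decision flow only; the classifier and types are placeholders.

struct IncomingPhoto {
    let data: Data
}

/// Hypothetical stand-in for the on-device machine-learning classifier.
func isSexuallyExplicit(_ photo: IncomingPhoto) -> Bool {
    // Placeholder: a real implementation would run an image-classification model locally.
    return false
}

enum ChildChoice { case view, dontView }

/// Models the receive-side flow: blur and warn first, then notify parents
/// only if notifications are enabled and the child chooses to view the photo.
func handleIncoming(_ photo: IncomingPhoto,
                    parentNotificationsEnabled: Bool,
                    childChoice: ChildChoice) {
    guard isSexuallyExplicit(photo) else {
        print("Show photo normally")
        return
    }
    print("Blur photo and show warning with resources")
    if childChoice == .view {
        print("Display photo")
        if parentNotificationsEnabled {
            print("Send notification to parents")
        }
    }
}

// Example call; the placeholder classifier always returns false, so the
// photo is shown normally here.
handleIncoming(IncomingPhoto(data: Data()),
               parentNotificationsEnabled: true,
               childChoice: .dontView)
```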
Apple is also expanding guidance in Siri and Search to help people stay safe online and report CSAM or child exploitation. Users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.