Apple announced it is delaying new child safety features in iOS and related services, following privacy concerns raised by customers and advocacy groups. Announced a month ago, the proposed changes would see the company check customer photos and videos stored in Apple's iCloud for potential child abuse images, which would then be reported to the authorities in the US.
In a short statement to US media, Apple said it "decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features". Further details were not provided. The changes had been expected to roll out later this year as part of the next iOS update.
When Apple first announced the system, it provided a detailed explanation of how the image checking would work. The company said it designed the system with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching against a database of known child sexual abuse material (CSAM) image hashes provided by child safety organizations. An account would only be flagged once the number of matching images exceeded a threshold, which would be set to ensure an extremely high level of accuracy and less than a one-in-one-trillion chance per year of incorrectly flagging a given account, the company said.
Apple itself would be able to access user photos only if an account crossed that threshold for known CSAM, and even then only the matching images.
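Conceptually, the proposal amounts to threshold matching against a set of known image hashes. The Swift sketch below illustrates only that idea under simplified assumptions: the names (knownHashes, matchThreshold, perceptualHash) are placeholders, and Apple's published design additionally uses a perceptual hash (NeuralHash) and cryptographic "safety vouchers" with private set intersection, so that individual matches stay hidden from both the device and the server until the threshold is crossed.

```swift
import Foundation

/// Illustrative sketch of threshold-based on-device matching.
/// The hash function and database here are stand-ins, not Apple's
/// actual implementation.
struct ThresholdMatcher {
    let knownHashes: Set<Data>   // hashes supplied by child safety organizations
    let matchThreshold: Int      // set very high so false flags are extremely unlikely

    /// Placeholder for a perceptual hash of the image contents.
    /// A real perceptual hash is robust to resizing and re-encoding.
    func perceptualHash(of imageData: Data) -> Data {
        Data(imageData.prefix(32))
    }

    /// Returns true only if the number of matching photos crosses the threshold.
    func accountExceedsThreshold(photos: [Data]) -> Bool {
        let matches = photos
            .map { perceptualHash(of: $0) }
            .filter { knownHashes.contains($0) }
            .count
        return matches >= matchThreshold
    }
}
```

In the real system, the device would not learn whether an individual image matched; the sketch simply shows why a per-account threshold, rather than per-image reporting, underpins Apple's accuracy claim.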
Privacy groups are concerned that once Apple sets up such a system, it could be pressured to expand it for wider use by government authorities. The surveillance capacity could be used to restrict free speech and threaten the privacy and security of Apple users around the world. More than 90 civil society organizations from around the world, in a letter coordinated by the Center for Democracy & Technology, called on Apple CEO Tim Cook to abandon the proposal.
Other measures planned by Apple include warnings in the Messages app to children and their parents when sexually explicit photos are sent or received, as well as expanded guidance in Siri and Search to help people avoid and report offensive material. Siri and Search would also be updated to intervene when users perform searches related to CSAM.