Apple is delaying the introduction of its child abuse detection techniques for iOS users’ images until a date yet to be determined. The company says it is doing so based on feedback from users, researchers and others.
Apple will be gathering more input in the coming months on its plans to detect child abuse material via iOS, and also plans to make further improvements ahead of the release. In a short statement to American media such as AppleInsider, Apple speaks of ‘critically important child safety features’.
The Cupertino company says it has decided to take more time based on feedback from users, advocacy groups, researchers and others. The company does not specify a new target date for implementation. Apple initially planned to implement the feature this year.
Last month, Apple announced plans to add functionality to iOS and iPadOS that would scan photos US users upload to iCloud for known child abuse imagery. According to Apple, the mechanism would protect the privacy of innocent customers and carry a very low risk of false positives.
The plans drew strong criticism from security researchers, privacy activists and legal experts, among others, who pointed to the long-term risks to privacy and security. Last week, digital rights organization Electronic Frontier Foundation petitioned Apple not to go through with the plans; the petition was signed by 25,000 people.
The foundation of Apple’s system is Private Set Intersection, the technology the company plans to use to scan images. PSI is a cryptographic technique that lets two parties determine which items their datasets have in common without revealing the rest of their contents to each other.
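To make the idea concrete, here is a minimal, purely illustrative sketch of set intersection over hashed values. Note the heavy caveat: real PSI protocols, including the one Apple described, use cryptographic blinding so that neither party learns the other’s full set; this plain hash comparison is only a conceptual stand-in, and the function names and sample data are hypothetical.

```python
import hashlib


def sha256_digest(item: bytes) -> str:
    """Hash an item so raw values are never compared directly."""
    return hashlib.sha256(item).hexdigest()


def naive_intersection(client_items, server_items):
    """Return the items both parties hold, comparing only hashes.

    CAVEAT: this is NOT a real PSI protocol. In actual PSI, the
    comparison happens under encryption/blinding so the server never
    sees the client's hashes in the clear (and vice versa). This
    sketch only illustrates the end result: finding shared items.
    """
    server_hashes = {sha256_digest(s) for s in server_items}
    return [c for c in client_items if sha256_digest(c) in server_hashes]


# Hypothetical example: which of the client's image fingerprints
# also appear in the server's known-image database.
client = [b"photo-a", b"photo-b", b"photo-c"]
server = [b"photo-b", b"photo-x"]
print(naive_intersection(client, server))  # prints [b'photo-b']
```

The design point the sketch conveys is that only matches are surfaced: items outside the intersection contribute nothing to the output, which is the privacy property a genuine PSI protocol enforces cryptographically rather than by convention.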