Apple canceled child abuse detection tool due to ‘feature creep’
Apple has canceled the tool it developed to detect child abuse images in iCloud, citing concerns about feature creep. Apple feared that certain countries would demand that the tool be used for censorship.
Apple has now given a much more extensive explanation. Among other things, the company feared that criminals could abuse the system. "It could also lead to a slippery slope of unintended consequences. Scanning for one type of content, for example, opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems across content types, such as images, videos, text, or audio, and across content categories. How can users be sure that a tool for one type of surveillance has not been reconfigured to surveil other content, such as political activity or religious persecution?"
That would have negative consequences for freedom of expression and democracy, Apple argues. "Additionally, building this technology for one government could lead to demands from other countries to apply it to new data types. Scanning systems are also not infallible, and there is documented evidence from other platforms that innocent parties have been swept into dystopian dragnets, making victims of people who have done nothing wrong."
Apple explained the decision to stop developing the tool in a letter to Heat Initiative that Wired published online. The iPhone maker announced the tool, which was to scan iPhones locally for known child abuse images, in 2021, but canceled it at the end of last year. At the time, its explanation was relatively brief.
Communication Safety