Apple promises to use photo scan only to detect child abuse


Apple promises that the photo scanning feature it announced last week will be used on users’ devices only to detect known child abuse material. If governments want it to track down other content, Apple says it will refuse.

In its FAQ about the new technologies, Apple wrote on Monday that it has for years refused government requests that would compromise user privacy: “Let’s be clear: this technology is limited to detecting child abuse images in iCloud, and we will not grant requests from governments to expand it.”

The fear that Apple will eventually expand the feature to things other than child abuse images is one of the biggest criticisms from civil rights organizations such as the Electronic Frontier Foundation. The Verge points out that Apple has complied with government requests in the past. For example, FaceTime is not available in various countries because it is end-to-end encrypted and governments do not allow it. The company has also allowed iCloud data of Chinese users to be stored in a data center managed by the Chinese government.

Apple unveiled the method last week as part of several measures to curb the distribution of child abuse images. Apple is getting a database of hashes of child abuse photos from child protection organizations, such as the National Center for Missing and Exploited Children.

That database is then distributed in encrypted form to all Macs, iPads and iPhones, where each device locally compares its hashes against hashes of local photos that are about to be uploaded to the iCloud Photos backup service. When the number of matches exceeds a threshold, the device sends a notification to an Apple server. The feature will launch in the US later this year.
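The threshold-based matching described above can be sketched roughly as follows. This is an illustrative simplification only: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, not the plain SHA-256 and threshold value assumed here.

```python
import hashlib

# Hypothetical stand-in for the encrypted database of known hashes
# distributed to devices; a plain SHA-256 replaces NeuralHash here.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

# Hypothetical threshold; Apple has not published the real value.
MATCH_THRESHOLD = 3


def count_matches(photos: list) -> int:
    """Count local photos whose hash appears in the known-hash database."""
    return sum(
        hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
        for photo in photos
    )


def should_notify(photos: list) -> bool:
    """The device only reports to the server once matches exceed the threshold."""
    return count_matches(photos) > MATCH_THRESHOLD
```

The key design point the article describes is that a single match is never reported: the device stays silent until the match count crosses the threshold, which reduces the chance of a false positive triggering a report.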
