WhatsApp won’t scan photos for child abuse imagery like Apple


WhatsApp will not adopt Apple’s method of scanning users’ private photos for child abuse images. The head of the WhatsApp division within Facebook has spoken out against the approach.

WhatsApp head Will Cathcart calls the system a ‘setback to privacy’. “This is the wrong approach and a setback to the privacy of people around the world. (…) We have had computers for decades and there has never been a mandate to scan the private files on every desktop, laptop and phone looking for content that’s illegal. That’s not how software works in the free world.” Cathcart is particularly concerned that governments in countries where iPhones are sold will eventually be able to dictate what the system scans for.

Apple unveiled the method last week as part of several measures to combat the proliferation of child abuse imagery. Apple obtains a database of hashes of child abuse photos from organizations dedicated to protecting children, such as the National Center for Missing and Exploited Children.

That database is then distributed in encrypted form to all Macs, iPads, and iPhones, where each device locally compares the hashes against hashes of photos that are about to be uploaded to the iCloud Photos backup service. If the number of matches exceeds a threshold, the device sends a notification to an Apple server. The feature is set to go live in the US sometime this year.
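To make the threshold mechanism concrete: Apple’s actual system uses a perceptual hash (NeuralHash) combined with cryptographic protocols such as private set intersection, none of which are reproduced here. The minimal Python sketch below only illustrates the general idea described above; the hash function, the database contents, and the threshold value are all illustrative stand-ins, not Apple’s real parameters.

```python
import hashlib

# Stand-in database of known-image hashes. The real system distributes an
# encrypted database of perceptual hashes; SHA-256 is used here only to keep
# the sketch self-contained and runnable.
KNOWN_HASHES: set[str] = {
    hashlib.sha256(b"example known image bytes").hexdigest(),
}

# Illustrative threshold; Apple has not published its exact value.
MATCH_THRESHOLD = 30


def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of a photo's contents."""
    return hashlib.sha256(photo_bytes).hexdigest()


def count_matches(photos_for_backup: list[bytes]) -> int:
    """Count photos bound for backup whose hash appears in the database."""
    return sum(1 for p in photos_for_backup if photo_hash(p) in KNOWN_HASHES)


def should_notify_server(photos_for_backup: list[bytes]) -> bool:
    """The device notifies the server only once matches cross the threshold."""
    return count_matches(photos_for_backup) >= MATCH_THRESHOLD
```

The key design point the sketch captures is that individual matches are never reported on their own: only when the accumulated match count crosses the threshold does the device contact the server, which is meant to limit false positives from any single photo.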

The WhatsApp chief is not the first to criticize the system. Civil rights organizations including the Electronic Frontier Foundation and the Center for Democracy & Technology have already spoken out against Apple’s approach.
