Apple rolls out feature to scan messages to kids for nudes


In the second beta of iOS 15.2, Apple has implemented a feature that scans messages sent to children for nude images, a sign that the feature's release is getting closer.

The feature is opt-in, writes 9to5Mac. If an iPhone or iPad is linked to the iCloud account of a child under 13, parents can use Family Sharing to enable the option to scan incoming messages for nude images. The software then automatically blurs any such image and displays a warning.

Images are monitored and analyzed on the device itself using machine learning, which keeps iMessage's encryption intact. The feature does not work with other messaging apps, such as WhatsApp or Snapchat, and appears to be available only in the United States for now.
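Apple has not published how the feature is implemented, but the general pattern it describes — run a local classifier, then blur the image before display — can be sketched in a few lines of Swift. The `isLikelySensitive` stub below is hypothetical (Apple's model is private); the blurring uses the standard Core Image APIs:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Hypothetical stand-in for Apple's private on-device classifier.
// A real implementation would run a Core ML model, e.g. via Vision.
func isLikelySensitive(_ image: CIImage) -> Bool {
    return false // placeholder: swap in an actual model inference here
}

// Blur an incoming image before display if the on-device check flags it.
// All analysis happens locally, so message content never leaves the device.
func prepareForDisplay(_ uiImage: UIImage) -> UIImage {
    guard let ciImage = CIImage(image: uiImage),
          isLikelySensitive(ciImage) else { return uiImage }

    let blur = CIFilter.gaussianBlur()
    blur.inputImage = ciImage.clampedToExtent() // avoid transparent edges
    blur.radius = 40

    let context = CIContext()
    guard let output = blur.outputImage,
          let cgImage = context.createCGImage(output, from: ciImage.extent)
    else { return uiImage }
    return UIImage(cgImage: cgImage)
}
```

Because the check runs entirely on the device, the encrypted message payload is decrypted only locally, which is why the approach does not weaken iMessage's encryption in transit.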

This scanning feature is distinct from the plan to scan images that users upload to iCloud. Apple announced both measures at the same time this summer; the intention was that iPhones and iPads would automatically check photos against databases of known child abuse imagery. That plan has been postponed following criticism.
