Report: Semi-public WhatsApp group chats are being used to distribute child porn


Semi-public group chats on WhatsApp have been used to distribute child porn to an unknown extent, according to a report by two Israeli non-profit organizations. WhatsApp acknowledges the problem and suggests that smartphone makers build a scanning feature into their software.

That scan would cryptographically match images on the phone against known databases of child pornography to determine whether the user is viewing such material in WhatsApp, TechCrunch writes. Because the matching would happen on the phone itself, WhatsApp could keep the end-to-end encryption in its chat app intact to prevent eavesdropping.
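The report and the article do not describe the mechanism in detail, but the idea of on-device matching can be illustrated with a minimal sketch. The hash set, function name and the use of plain SHA-256 below are assumptions for illustration only; real child-safety databases such as Microsoft's PhotoDNA rely on perceptual hashes so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical set of hashes of known abusive images, as a child-safety
# database might distribute them to the device.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def matches_known_database(image_path: str) -> bool:
    """Hash an image file on the device and check it against the known set.

    The image itself never leaves the phone; only a match/no-match result
    would need to be acted on, which is why such a scan could coexist with
    end-to-end encryption of the messages themselves.
    """
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() in KNOWN_IMAGE_HASHES
```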

The problem lies with semi-public group chats: group conversations whose invitation link is publicly available online, so that members do not have to know each other, the report shows. The non-profit organizations found a number of such groups whose names barely conceal the fact that members share child pornography in them.

It is unknown exactly how big the problem is. The organizations found dozens of such groups, but do not know how many remain undiscovered. Users find the invitation links on websites or in third-party apps. WhatsApp currently relies on algorithms that scan unencrypted data such as group names and profile pictures, but that filter was not enough to catch the groups that have now been discovered.
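The article does not say how that metadata filter works. A minimal sketch of the general idea, assuming a simple keyword check on the unencrypted group name and description, shows why such filtering is easy to evade; the watch list, function name and evasion example below are placeholders, not WhatsApp's actual filter.

```python
# Illustrative-only watch list; real terms are deliberately omitted.
WATCH_LIST = {"forbidden-term-a", "forbidden-term-b"}


def group_metadata_is_suspicious(name: str, description: str = "") -> bool:
    """Flag a group when its unencrypted metadata contains a watched term.

    Only metadata is inspected; message contents stay end-to-end encrypted.
    """
    text = f"{name} {description}".lower()
    return any(term in text for term in WATCH_LIST)


# A trivially altered spelling already slips past this kind of filter:
assert group_metadata_is_suspicious("forbidden-term-a fans") is True
assert group_metadata_is_suspicious("f0rbidden-term-a fans") is False
```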

WhatsApp does have restrictive measures for groups: membership is capped at 256 people and the app offers no search function for group chats. When WhatsApp encounters groups like these, the Facebook subsidiary takes them down and bans all of their members from WhatsApp.
