Facebook employees rate nude photos in revenge porn project


In a new program Facebook has set up to combat revenge porn, company employees, not algorithms, review the nude photos that users submit. A hash is then generated from each photo.

A Facebook spokesperson told The Daily Beast that the human assessment of the uncensored nude photos, which users voluntarily upload for the project, is necessary to determine whether other users’ posts contain revenge porn. Among other things, it should prevent legitimate photos from being wrongly deleted.

Earlier this week, ABC reported on a trial by Facebook and an Australian government agency that aims to prevent nude photos or sexually explicit images from being distributed against someone’s will. To take part, users would need to contact the Australian government’s eSafety office, which would then ask them to send the intimate photos at risk of being leaked to themselves via Facebook Messenger.

Facebook would then hash the file and use that hash to match future uploads against the images labeled as revenge porn, blocking third parties from posting the photos. According to eSafety Commissioner Julie Inman Grant, Facebook would not store the image itself and would use artificial intelligence in the matching. Antigone Davis, Facebook’s head of global safety, spoke of cutting-edge technology.
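The report does not say which hashing technique Facebook actually uses. As an illustration only, the sketch below shows how a perceptual hash can flag a re-uploaded copy of a known image while retaining only the hash, not the image. It assumes the open-source Python imagehash library; the file names and the distance threshold are hypothetical and not taken from Facebook’s system.

```python
# Illustrative sketch only: perceptual-hash matching with the open-source
# "imagehash" library. This is NOT Facebook's actual implementation; the
# article does not disclose which hashing method the company uses.
from PIL import Image
import imagehash

# Hash computed once from the reported image; only this hash needs to be
# retained, so the image itself can be discarded afterwards.
reported_hash = imagehash.phash(Image.open("reported_photo.jpg"))

def looks_like_reported(upload_path: str, max_distance: int = 5) -> bool:
    """Return True if the uploaded image's perceptual hash is close enough
    (small Hamming distance) to the stored hash of the reported image."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # imagehash overloads subtraction to return the Hamming distance.
    return (upload_hash - reported_hash) <= max_distance

if looks_like_reported("incoming_upload.jpg"):
    print("Block upload: matches a reported image")
```

Unlike a cryptographic hash, a perceptual hash stays nearly identical when an image is resized or recompressed, which is why a small distance threshold, rather than exact equality, is used for the match.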

Facebook told The Daily Beast that the material is kept “for a certain period of time” so that a small group of employees can check whether the policy is being applied correctly, and that the images are blurred at that stage. For the first assessment, however, the photos are viewed in their original, unblurred form.
