Forbes: TikTok moderators shown child abuse footage during training

American TikTok moderators working for the company Teleperformance are shown child sexual abuse material, or CSAM, as part of their training, business magazine Forbes reports, based on information from former TikTok moderators.

The moderators Forbes spoke with worked for the American branch of Teleperformance, a French multinational that, among other things, moderates posts for social media platforms. To learn to recognize and tag CSAM, the employees were shown footage of children being abused during training. In addition, they were given spreadsheets containing hundreds of examples of CSAM and other content prohibited by TikTok. The moderators had to consult these spreadsheets daily as a frame of reference, the ex-employees say.

“I have a daughter and I don’t think it’s right for a group of strangers to watch this material,” a former moderator who left Teleperformance in 2020 told Forbes. “I don’t think they should use something like that for training.”

Whitney Turner, also a former Teleperformance moderator, was so shocked by the material that she contacted the FBI. She says she spoke to an FBI agent in June, although the bureau would not confirm this to Forbes. “I was moderating and I was like, This is someone’s son. This is someone’s daughter. These parents don’t know we saved this material. I’m sure the parents would want to destroy TikTok if they knew about this,” Turner said.

In the United States, the distribution and retention of CSAM is prohibited. Companies are required to report such content to the National Center for Missing and Exploited Children (NCMEC). The material must then be retained for 90 days, but the number of people who see it must be kept to a minimum.

TikTok told Forbes that its own moderator training materials do not contain CSAM, but said that “the platform works with third parties who provide their own training.” Teleperformance denies showing child abuse images in its training courses and spreadsheets.
