from the not-how-to-do-things dept
WTF, TikTok? Time and time again we see TikTok do weird things regarding content moderation. More than most firms in the space, TikTok acts as if it hasn’t bothered to speak to other experts in trust and safety and would rather reinvent the wheel… but with terrible, terrible instincts. Apparently that applies to the company’s third-party moderators as well.
Forbes has an astonishing piece claiming that Teleperformance, a key third-party company that TikTok uses for moderation, showed trainees actual images of child sexual abuse material (CSAM) as part of its moderator training. This is so unbelievably stupid that I still almost don’t believe it could possibly be true. Possession of CSAM is a strict liability situation. There are rules for how online service providers must handle any CSAM they come across, involving notifying the National Center for Missing & Exploited Children (NCMEC) via its CyberTipline. 18 U.S. Code § 2258A spells out how a provider that discovers CSAM must handle that content: sending a report to NCMEC, and then preserving the content as evidence for law enforcement.
But also, making damn sure that the content is kept very, very locked up:
A provider preserving materials under this section shall maintain the materials in a secure location and take appropriate steps to limit access by agents or employees of the service to the materials to that access necessary to comply with the requirements of this subsection.
Nowhere in the law do I see anything even remotely suggesting that a company can hang onto this material in a non-secure manner, let alone show the content to employees as part of training. I mean… I just can’t. How did anyone think this made sense?
I mean, sure, you can concoct a chain of reasoning that gets you there: we need to train employees, and the best way to train employees is to show them examples. But, holy shit, how does no one realize way earlier that YOU DON’T DO THAT with CSAM?! I don’t see how it’s even possible that people didn’t realize how problematic this was. I mean, this paragraph just has me screaming out loud, because how does this happen?
Whitney Turner, who worked for Teleperformance’s TikTok program in El Paso for over a year and departed in 2021, also recalled being shown sexually exploitative imagery of kids as part of her training. Whitney was given access to a shared spreadsheet that she and other former employees told Forbes is filled with material determined to be violative of TikTok’s community guidelines, including hundreds of images of children who were naked or being abused. Former moderators said the document, called the “DRR,” short for Daily Required Reading, was widely accessible to employees at Teleperformance and TikTok as recently as this summer. While some moderators working in unrelated functions were restricted from viewing this material, sources told Forbes that hundreds of people across both companies had free access to the document. The DRR and other training materials were stored in Lark, internal workplace software developed by TikTok’s China-based parent company, ByteDance.
The excuses given are equally unbelievable.
Teleperformance’s Global President of Trust & Safety Akash Pugalia told Forbes the company does not use videos featuring explicit content of child abuse in training, and said it does not store such material in its “calibration tools,” but would not clarify what those tools are or what they do. He declined to answer a detailed list of other questions regarding how many people have access to child sexual abuse material through the DRR and how Teleperformance safeguards this imagery.
The Forbes piece has lots of crazy details, including the fact that tons of people had access to this content, and more: like one moderator who says her job didn’t even involve CSAM, and that she never encountered any on the job other than when it was shown to her as part of her training.
Honestly, this feels like the kind of thing that could, and perhaps should, lead to criminal charges against someone.
Filed Under: content moderation, csam, training
Companies: teleperformance, tiktok