
Content Moderator Sues TikTok, Alleging Trauma After ‘Constant’ Exposure to Violent Videos

The plaintiff says she had to watch extreme content including rapes, beheadings and suicides for up to 12 hours a day

TikTok is being sued by one of its content moderators who claims she suffered psychological trauma after viewing hours of video of sexual assaults, suicides, killings and other violent scenes.

Candie Frazier says she and other moderators spent up to 12 hours a day reviewing “extreme and graphic violence” in order to prevent the content from being viewed by the platform’s users. Images included “genocide in Myanmar, mass shootings, children being raped, and animals being mutilated,” says the lawsuit.

The lawsuit was filed in California Central District Court against TikTok and ByteDance, its Chinese parent company.

Frazier is not employed by TikTok but rather by Telus International, which provides workers to other businesses. Her lawsuit contends, however, that “ByteDance and TikTok control the means and manner in which content moderation occurred.”

Because of the exposure to the graphic videos, Frazier alleges she developed “significant psychological trauma including anxiety, depression, and posttraumatic stress disorder.” The lawsuit argues that TikTok violated California labor laws because it failed to provide a “safe work environment.”

The legal action seeks compensation for Frazier and other moderators who were exposed to the content, and asks the court to require ByteDance and TikTok to provide mental health support and treatment. The plaintiffs also want the companies to introduce more technical safeguards, such as blurring videos or reducing their resolution.

According to the lawsuit, the moderators watch 10 videos simultaneously and are pushed by TikTok software “to review videos faster and faster.” Moderators are allowed to take two 15-minute breaks and a one-hour lunch break during their 12-hour shifts. ByteDance allegedly monitors workers’ performance closely and “heavily punishes any time taken away from watching graphic videos.”

Other social media platforms have faced similar allegations.

Last year, Facebook and its parent company, now known as Meta, agreed to a settlement totaling $52 million with thousands of its moderators. In their 2018 lawsuit, the moderators claimed Facebook failed to protect them from traumatic content.

The moderators were represented by Joseph Saveri Law Firm, which also filed Frazier’s lawsuit.

TikTok spokeswoman Hilary McQuaide declined to comment on “ongoing litigation” in a statement to The Verge, but said the company would “continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”

“We strive to promote a caring working environment for our employees and our contractors,” said TikTok.

Frazier’s employer, Telus International, said via a spokesperson that it was “proud of the valuable work our teams perform to support a positive online environment” and noted that it had a “robust resiliency and mental health program in place.”