A former TikTok moderator is suing TikTok and its parent company ByteDance Inc. for allegedly failing to protect her mental health after she had to watch hours of traumatic videos.
In the proposed class-action lawsuit, Candie Frazier claimed that she had screened videos involving freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, Bloomberg reported.
TikTok's 10,000 content moderators are exposed to a constant stream of child pornography, rapes, beheadings and animal mutilation, according to the lawsuit filed in a Los Angeles court last week.
'Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares,' said the complaint.
Frazier, who worked for a third-party contracting firm, stated that TikTok moderators were required to review hundreds of videos a day.
She also described how moderators are made to work 12-hour shifts with only one hour off for lunch and two 15-minute breaks.
During these long shifts, moderators often have to watch three to ten videos at once, spending as little as 25 seconds reviewing each one. ByteDance keeps a close eye on the moderators' performance, The Verge reported, and 'heavily punishes any time taken away from watching graphic videos'.
Without measures in place to protect her mental health on the job, Frazier claimed that she has developed panic attacks and depression as well as symptoms associated with anxiety and post-traumatic stress disorder (PTSD).
Industry standards to protect content moderators include offering them frequent breaks and psychological support, which the lawsuit says TikTok failed to provide. Frazier also claimed that the social media giant had no technical safeguards in place, such as blurring or reducing the resolution of the disturbing videos that moderators have to watch.
With the class action suit, Frazier is hoping to get TikTok to pay her and other content moderators for the psychological injuries they have suffered. She also wants the court to order the company to set up a medical fund for content moderators.
A TikTok spokesperson told Metro that they were not able to comment on ongoing litigation. However, TikTok claims that it works hard 'to promote a caring working environment for our employees and contractors'.
'Our Safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,' said the TikTok spokesperson.
TikTok was part of a group of social media companies including Facebook and YouTube that developed guidelines for helping moderators cope with the images of child abuse that their jobs required them to view, according to the complaint.
However, TikTok failed to provide psychological support or to limit moderators' shifts to four hours, according to the lawsuit.
Social media companies have been criticised in the past for not providing adequate mental health support to their moderators, given the job's psychological hazards. In 2020, Facebook agreed to pay $52 million (£42 million) to content moderators as compensation for PTSD developed on the job.