[Photo: Susan Wojcicki, CEO of YouTube. Michael Newberg | CNBC]
A former YouTube moderator sued YouTube on Monday, accusing it of failing to protect workers who have to catch and remove violent videos posted to the site.
The suit says the plaintiff was required to watch murders, abortions, child rape, animal mutilation, and suicides. YouTube parent company Google faces increasing pressure to control content ranging from violence to misinformation — particularly ahead of the 2020 U.S. election and amid federal investigations.
YouTube training for contractors includes a video of a “smashed open skull with people eating from it,” among other violent videos, the lawsuit alleges. The suit seeks compensation and medical treatment for the plaintiff and others who have experienced similar trauma.
The plaintiff, referred to as “Jane Doe,” worked as a YouTube content moderator for the staffing firm Collabera from 2018 to 2019. As a result of the violent content she viewed on the job, she experienced nightmares, panic attacks and an inability to be in crowded places, the lawsuit says.
YouTube’s “Wellness Coaches” were not available to people who worked evening shifts and were not licensed to provide professional medical guidance, the suit says. It also alleges moderators had to pay for their own medical treatment when they sought professional help.
Neither YouTube nor Collabera immediately responded to requests for comment.
During a company training session, the lawsuit alleges, trainees were shown videos of a woman being kidnapped and beheaded by a cartel, as well as bestiality, suicides, children being raped, births and abortions. “As the example was being presented, Content Moderators were told that they could step out of the room, but Content Moderators were concerned that leaving the room would mean they might lose their job because, at the end of the training, new Content Moderators were required to pass a test.”
The suit says many content moderators remain in the position for less than a year and that the company is “chronically understaffed,” so moderators end up working overtime and exceeding the company’s recommended four-hour daily viewing limit. Despite the demands of the job, moderators have little margin for error, the suit states.
The company expects each moderator to review between 100 and 300 pieces of video content each day with an “error rate” of 2% to 5%, the suit claims. The companies also control and monitor how videos are displayed to moderators: whether in full screen or as thumbnails, whether blurred, and how quickly they appear in sequence.
The suit comes as moderators for social media companies speak out about the toll the job takes on their mental health. YouTube has thousands of content moderators, most of whom work for third-party vendors including Collabera, Vaco and Accenture. The Joseph Saveri Law Firm, a San Francisco-based firm representing the plaintiffs, filed a similar lawsuit against Facebook that resulted in a $52 million settlement in May.
It also comes as Google-owned YouTube has reportedly reverted to human moderators to find and delete content, after relying on computers to automatically sift through videos during the pandemic. The company switched back because the automated systems were removing too many videos that didn’t violate any rules.