
Former YouTube content moderator describes horrors of the job in new lawsuit

Susan Wojcicki, CEO of YouTube.

Michael Newberg | CNBC

A former YouTube moderator sued YouTube on Monday, accusing it of failing to protect workers who have to catch and remove violent videos posted to the site. 

The suit says the plaintiff was required to watch murders, abortions, child rape, animal mutilation and suicides. YouTube parent company Google faces increasing pressure to control content spanning violence and misinformation, particularly ahead of the 2020 U.S. election and amid antitrust investigations from state attorneys general, the Department of Justice and Congress.

The plaintiff, who is referred to as “Jane Doe,” worked as a YouTube content moderator for staffing firm Collabera from 2018 to 2019. She developed nightmares, panic attacks and an inability to be in crowded places as a result of the violent content she viewed while working for the company, the lawsuit says.

YouTube’s “Wellness Coaches” weren’t available for people who worked evening shifts and were not licensed to provide professional medical guidance, the suit says. It also alleges moderators had to pay for their own medical treatment when they sought professional help.

Neither YouTube nor Collabera responded to requests for comment.

The suit says many content moderators remain in the position for less than a year and that the company is “chronically understaffed,” so moderators end up working overtime and exceeding the company’s recommended four-hour daily viewing limit. Despite the demands of the job, moderators had little margin for error, the suit states.

The company expects each moderator to review between 100 and 300 pieces of video content each day with an “error rate” of 2% to 5%, the suit claims. The companies also control and monitor how videos are displayed to moderators: whether they appear full-screen or as thumbnails, whether they are blurred, and how quickly moderators must watch them in sequence.

The suit comes as moderators for social media companies speak out about the toll the job takes on their mental health. YouTube has thousands of content moderators, and most work for third-party vendors including Collabera, Vaco and Accenture. Joseph Saveri Law Firm, a San Francisco-based firm representing the plaintiff, filed a similar lawsuit against Facebook that resulted in a $52 million settlement in May.

The suit suggests YouTube may need to provide more resources for the people tasked with removing videos that violate its rules. YouTube has reportedly reverted to relying on humans to find and delete content after using computers to automatically sift through videos during the pandemic, because the automated systems were removing too many videos that didn’t violate any rules.
