22 Sep, 2020 16:40

YouTube content moderator sues platform over PTSD & depression, hiring lawyers who got Facebook to cough up $52mn

A former YouTube mod has sued the Google-owned platform, claiming that on-the-job exposure to child abuse, beheadings, and other vile content left her with depression and PTSD. Her lawyers previously forced a large settlement out of Facebook.

The unnamed ex-YouTube moderator is demanding compensation and treatment for the trauma she says she suffered, claiming in a lawsuit filed on Monday that the work left her with depression and symptoms of PTSD. According to the filing, she now experiences panic attacks, nightmares, and crippling anxiety, shuns crowds and children, and fears ever having kids of her own.

Alleged "chronic understaffing" at YouTube forced her and her coworkers to review anywhere from 100 to 300 pieces of graphic content – murders, suicides, bestiality, torture, and so on – over a four-hour period, the class-action lawsuit charges. Expected to accurately grade the content with an error rate of under five percent, mods couldn't just skip or look away from unpleasant videos.

YouTube is accused of violating California law by failing to safeguard employees' mental health and provide a "safe workplace" for the plaintiff and her colleagues. The suit also calls for the company to set up a medical monitoring program to screen content moderators for symptoms of mental illness, diagnose sufferers, and provide treatment – all on YouTube's dime.


Joseph Saveri Law Firm, which filed the case on behalf of the anonymous content reviewer and her colleagues, has some experience taking multi-billion-dollar tech firms to the cleaners on behalf of traumatized content moderators. A 2018 lawsuit it filed for Facebook mods ended with the tech behemoth coughing up a $52 million settlement earlier this year.

The YouTube suit charges that the company didn't adequately warn prospective moderators of the depths of depravity they'd be combing through, merely informing them they might be required to view "graphic" content. Nowhere is the possible mental health impact of repeated exposure to such content described, and workers sign a non-disclosure agreement before starting their jobs, meaning they can't discuss the details with a typical therapist.

While one might think being exposed to non-stop violence and abuse would be enough to allege trauma, the lawyers took care to point out that moderators also suffered "repeated exposure to conspiracy theories, fringe beliefs, and political disinformation," implying some equivalency between "false information about participating in the census" or "manipulated/doctored videos of elected officials" and clips of animal torture or genocide.

In oddly specific language, the suit complains that YouTube doesn't ease its reviewers into the job "through controlled exposure with a seasoned team member followed by counseling sessions." Nor does it "alter the resolution, audio, size, and color of trauma-inducing images and videos" – though the platform might argue that doing so defeats the purpose of having human moderators review the content at all. "Human judgment is critical to making contextualized decisions on content," according to YouTube's own guidelines.


While wellness coaches are available on staff, they lack medical expertise and aren't around during the night shift, according to the suit. The anonymous moderator claims the first wellness coach she sought out advised her to take illegal drugs, while another suggested a distressed mod "trust in God."

Content moderators work on contract, meaning in normal circumstances YouTube isn't liable for their mental health issues. However, the lawsuit claims moderating videos on the platform is "abnormally dangerous" and accuses YouTube of providing "unsafe equipment," circumstances which would extend liability to the Google subsidiary. YouTube is also skewered in the lawsuit text for coming up with "industry standards for mitigating the harm to content moderators" to which it then failed to adhere.

Ultimately, however, YouTube moderators who sign on to the class action might be putting themselves out of a job. The platform has already moved toward algorithmic content moderation due to the Covid-19 pandemic, warning users their content would increasingly be evaluated by automatic processes. It's not clear to what degree human mods have already been replaced by algorithms, but Facebook and Twitter issued similar statements earlier this year. With pandemic restrictions on workplaces unlikely to disappear anytime soon, content mods may find they have all the time they need to recover from their exposure to traumatic images – and then some.
