26 Dec, 2021 02:47

TikTok video moderator sues over ‘mental’ damage

A content moderator for TikTok is suing the company, alleging that it failed to protect her mental health while she was forced to watch graphic videos including suicide, murder and cannibalism.

Candie Frazier – who allegedly worked 12 hours a day monitoring TikTok videos for the contracting company Telus International – filed a lawsuit against TikTok and its parent company ByteDance claiming she now “has trouble sleeping” and that “when she does sleep, she has horrific nightmares” due to the disturbing videos.

According to Bloomberg, the videos Frazier was exposed to at work include clips of “freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio.”

“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” the lawsuit alleged, claiming moderators’ only respite from the content was two 15-minute breaks and an hour for lunch.

In a statement, TikTok said that it seeks “to promote a caring working environment for our employees and contractors.”

“Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” a spokesperson claimed.

It’s not the first time a tech company has been sued for exposing content moderators to graphic material. In September, a former YouTube content moderator sued the Google-owned platform, alleging she had been left with symptoms of depression and PTSD after being made to watch videos of murder, suicide, bestiality, and torture. The moderator said “chronic understaffing” meant she was forced to watch between 100 and 300 videos in a single four-hour period.

In May, it was also reported that Facebook would be paying $52 million to moderators who allegedly developed PTSD from their work.