21 Jun, 2019 11:19

Is transparency the solution? RT panel talks disturbing report on Facebook content moderators

A media report has revealed that Facebook content moderators work under constant stress and suffer from the toll the grueling job takes on them. An RT panel looks at whether external oversight of the company is the solution.

An article by The Verge provided a disturbingly grim insight into the job of content moderators working for Cognizant, a Facebook contractor, in Tampa, Florida.

The workers told the outlet that they have to sift through hundreds of graphic videos each day for just $15 an hour, which takes a heavy toll on their mental health. Some suffer from post-traumatic stress disorder, while others report growing anxiety and depression. One Cognizant worker, a Coast Guard veteran, died at his desk after enduring what was described as an “unworldly” amount of stress.


They also said moderators have to share filthy, rarely cleaned desks and work in an office where a single dirty bathroom serves 800 employees.

Analysts told RT that something needs to be done immediately. One option is “greater transparency to allow an external body to go into Facebook and audit its processes in terms of moderation,” Yair Cohen, a UK-based social media lawyer, believes.

Unfortunately, Cohen noted, Facebook does not appear willing at this stage to allow that degree of transparency or any external audit to take place.

Bill Mew, a privacy activist and technology expert, says automated technologies could help relieve moderators of the need to deal with graphic content.

“I think we’re already there with [moderation of] text-based and, to [an] extent, voice-based posts,” he said.

However, algorithms are less helpful when dealing with video content, because automated bots “can’t really [tell] the difference between the dog and cat and the terrorist.”


Mew added: “I don’t think the technology is sophisticated enough to be the solution quite yet.”
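
Mew’s distinction between text and video is easier to see with a concrete example. Below is a minimal sketch, in Python with scikit-learn, of the triage pattern he alludes to for text-based posts: a classifier scores each post, the clear-cut cases are handled automatically, and only the ambiguous middle band is routed to a human moderator. The training examples, thresholds, and triage function are illustrative assumptions, not Facebook’s or Cognizant’s actual tooling.

```python
# Sketch of classifier-assisted text moderation (illustrative only;
# the data, model, and thresholds are placeholders, not a real pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled data: 1 = violates policy, 0 = benign.
posts = [
    "have a great day everyone",
    "check out this cute dog video",
    "I will hurt you if you post that again",
    "kill them all, they deserve it",
]
labels = [0, 0, 1, 1]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def triage(post: str) -> str:
    """Auto-approve, auto-remove, or escalate to a human reviewer."""
    p_violation = model.predict_proba([post])[0][1]
    if p_violation < 0.2:    # confidently benign
        return "approve"
    if p_violation > 0.8:    # confidently violating
        return "remove"
    return "human review"    # uncertain band: a person decides

print(triage("what a lovely cat"))
```

The value of the pattern is that the width of the “human review” band determines how much disturbing material moderators must still see, which is why Mew argues video, where automated scoring remains unreliable, is where the technology falls short.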

Facebook has come under fire in recent years over targeted censorship and alleged political bias. There have also been revelations that the social media giant exposed user data to other companies, and earlier this year it emerged that the corporation tracks former employees it considers a threat.

