Faceblock: German newspaper publishes 'hate speech' rules
Facebook’s community guidelines stipulate that posted content may not contain direct threats, self-injury, dangerous organizations, bullying and harassment, attacks on public figures, criminal activity, sexual violence and exploitation, or regulated goods. But what constitutes graphic subject matter or bullying is not always clear.
The documents offer some insights into how content moderators are taught to judge a post when the content frequently falls into subjective territory. But that does not do much for users who are banned, sometimes as the result of a joke.
To clarify matters, Facebook’s documents lay out criteria for protected groups, some of them bizarre. Banning verbal attacks on protected categories sounds reasonable at first glance: people are protected against comments attacking them for their sex, religious affiliation, country of origin, race, appearance, sexual orientation and similar markers.
The documents offer numerous examples of what is allowed and what isn’t. “Tall girls are just freaks!” is an example of an acceptable post, but “Refugees? More like rape-fugees” would be deleted. Facebook sums up the formula as “Protected category + attack = hate speech.”
Each of these characteristics is considered a “protected category,” but traits outside the list receive no such protection. For example, in November 2016 comedian and Twitter personality Nick Mullen was placed on a 30-day ban after posting “kill all white women” in response to a Huffington Post article.
“lol when u make fun of that shitty brogressives article” (pic.twitter.com/5IMdn7qZBt) – extremely online guy (@nickmullen), November 18, 2016
Despite Mullen being a working comedian, the post was considered hate speech because he attacked two protected categories. Had Mullen written “kill all retirees,” however, he might have been let off the hook.
What qualifies as bullying on Facebook has long been a subject of debate. While the guideline does not define bullying, it does explain that ranking private individuals by their looks or their personality counts as it.
For self-destructive behavior, Facebook asks its content moderators to use context to determine whether a post should be removed. Posts that encourage self-harm should be removed, but a picture that is considered a “cry for help” is allowed to stay up so that the poster’s friends can see it.
Facebook has very strict rules for handling posts that deal with bodily functions. According to the guideline, a picture of a man who appears to be straining on a toilet with the caption “haha! Looks like he is having some trouble” is considered bullying, as are pictures of women menstruating through their clothing with the caption “someone lend her a pad. Please! Lol!”
But a picture of a woman bleeding through her underwear is allowed if there is no caption on it, because it doesn’t qualify as bullying.
Facebook makes an exception for public figures, however, who are allowed to be shown “in the process of urinating, defecating, vomiting and menstruating.”
Facebook, be it a media company or social network, is privately owned and allowed to enforce whatever rules it sees fit. With an audience of over 1 billion users, objectionable content is guaranteed to appear, and the content moderators have their work cut out for them. But at least they know how to handle pictures of Harry Styles vomiting in public.