
Facebook has revealed how it suppresses news it doesn’t like and decides what’s true and what isn’t. It’s chillingly Orwellian

Micah Curtis

is a game and tech journalist from the US. Aside from writing for RT, he hosts the podcast Micah and The Hatman, and is an independent comic book writer. Follow Micah at @MindofMicahC

Facebook has finally detailed its procedures for weeding out ‘disinformation’ on its site. The process shows exactly why the dystopian attitudes of this monstrous business must be stood up to.

The idea of Facebook censoring news and using algorithms to keep people away from certain news sources has long been known. There’s been a lot of talk about initiatives to cut down on ‘fake news’ and misinformation, but never a confirmation from the company itself on exactly how it was doing this. Now Facebook has revealed its content distribution guidelines, and they are as Orwellian as you would expect.

According to a report by the Daily Wire, the methods used are open to exploitation by astroturfers, and they carry an obvious vulnerability: whatever is defined as misinformation can simply be anything that doesn't come from a politically leftist point of view. Terms like “fostering a safe community” and “incentivizing creators to invest in high quality and accurate content” have no clear definitions. Who defines what is safe? Who defines what is accurate?

Now, I do believe that absolute truth exists, but any time a story is reported or opined on, the facts can be disputed, let alone the conclusions drawn from them. Normally, this is where the audience has the opportunity to discern what is true and what is not, and make up their own minds.

By giving itself the final say over what is supposedly safe and supposedly true, Facebook is making that determination for the people who use its website. In effect, the company is claiming it is better at thinking and decision-making than its users are.


There had been pressure on Facebook for some time to start doing things like this, some of which came from Joe Biden and his campaign. No surprise there, as it is their world viewpoint that will come to the fore on Facebook, while that of their opponents will be suppressed. 

The confirmation of Facebook's willingness to take such a stand is chilling. The website you use to keep in touch with friends and family and to share recipes is now determining, on your behalf, what is true and what is not. This positions Facebook as the world’s self-appointed arbiter of truth.

The effects of its decision are twofold. The first is that any news organization such as RT or others can be branded as misinformation simply based on not having the right opinion (as defined by Facebook). The second is that it enshrines legacy left-wing media as the main – perhaps only – purveyors of what Facebook thinks and tells the world is correct.


The idea of a faceless Facebook employee sitting at a computer and deciding what is true and what is not conjures up the image of Winston Smith dropping inconvenient records of yesteryear into the memory hole in ‘1984’. The corporate language Facebook uses is just as purposely vague as that of The Party in the same novel. It is designed to be vague, because those with power over the situation can change what it means to suit their needs as they please, while taking agency away from those who use the site.

Any time our society edges closer to George Orwell's prophetic work is spine-chilling. It should be something we avoid, not something we use as an instruction manual. A corporation taking away the rights of ordinary people is as horrifying as a government doing it, if not more so; who appointed or elected them to hold these powers over our lives? Whether it's Mark Zuckerberg or Joe Biden, everyone, regardless of political position, should look at this situation and be revolted by it.

As it stands, there is one saving grace. Using Facebook is a choice. Unlike a government, the company cannot force you to do anything at the point of a gun. The obvious solution is simply to stop relying on Facebook for your news. The convenience Facebook offers is a wonderful thing, but the ability to read what you want from sources you trust is far more important. There has to be a point where people put their foot down over Silicon Valley determining what we can and cannot read or think.

Maybe if enough people walk away, there will be some sort of realization that Facebook is ultimately no different from a coffee shop or a pizza place. We go there for a simple service, not to be preached to about what is true and what is not.

The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.