2 May, 2018 16:16

Facebook will rank news sources by ‘trust’ – but who does Facebook trust?

Facebook will now rank news sources by “trustworthiness,” promoting or suppressing content on that basis. However, the list of news organizations enlisted to help define what “trustworthy” means suggests a strong political slant.

Speaking at Facebook’s annual F8 developer conference on Tuesday – a flashy two-day affair of keynote speeches and after-parties for the Silicon Valley elite – Zuckerberg said that the company has already gathered data from users, who it asked to identify various news brands and score them by trust.

“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time,” he said, according to BuzzFeed News. “We feel like we have a responsibility to further [break] down polarization and find common ground.”
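
Facebook has not published the formula behind this “boost or suppression,” but the mechanism described – survey-derived trust scores scaled by an adjustable intensity – can be illustrated with a minimal sketch. The names and formula below (trust_score, adjusted_rank, intensity) are hypothetical assumptions for illustration only, not Facebook’s actual system.

# Hypothetical sketch: survey-based trust scores acting as a "boost or
# suppression" on a post's ranking score. Names and formula are assumptions.

def trust_score(survey_responses):
    """Average of users' survey answers for a publisher, each in [0, 1]."""
    return sum(survey_responses) / len(survey_responses)

def adjusted_rank(base_score, publisher_trust, intensity=0.5):
    """Scale a post's base ranking score by its publisher's trust score.

    intensity=0 leaves the ranking unchanged; higher values "dial up"
    the effect, boosting trusted sources and suppressing the rest.
    """
    # Map trust in [0, 1] to a multiplier centred on 1.0.
    multiplier = 1.0 + intensity * (2.0 * publisher_trust - 1.0)
    return base_score * multiplier

# Example: a well-rated publisher gets a boost, a poorly rated one is suppressed.
high = trust_score([0.9, 0.8, 1.0])   # 0.9
low = trust_score([0.2, 0.3, 0.1])    # 0.2
print(adjusted_rank(100.0, high))     # 140.0 – boosted
print(adjusted_rank(100.0, low))      # 70.0 – suppressed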

Facebook, however, has been repeatedly accused of anti-conservative bias in its ranking algorithms. In his hearing before Congress last month, Republican representatives grilled Zuckerberg over reports that his platform routinely censors right-wing posts, like those by pro-Donald Trump vloggers Diamond and Silk. Zuckerberg denied these accusations, and called these incidents of censorship isolated and a “mistake.”

Slant

At the F8 conference, Zuckerberg met a group of media executives to discuss how Facebook will promote or censor news stories. Taking place behind closed doors, the meeting was named “OTR,” short for “Off the Record.” The composition of the group present will do little to reassure anybody concerned about bias or polarization.

Representatives came from BuzzFeed News, The Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, The Economist, the Huffington Post, Insider, The Atlantic, the New York Post, and others.

READ MORE: New York Times issues fake-news correction in article about fake news

Of all the major news organizations present, only two are considered in any way right of centre by AllSides’ rankings: the finance-focused Wall Street Journal and the tabloid New York Post. The rest lean sharply to the political left.

Content police

Facebook announced last month that it would be stepping up its content-policing efforts, partnering with third-party “fact checkers” including AP and AFP to verify news, photos, and videos. The company aims to have more than 20,000 staff dedicated to tackling fake news and hate speech by the end of 2018.

These virtual ‘thought-policemen’ will be aided by artificial-intelligence programs designed to pre-emptively weed out false information and propaganda. On Tuesday, Zuckerberg said that his company will invest “billions” of dollars into its content crackdown.

READ MORE: Britain's fake news committee issues summons threat to Facebook CEO Zuckerberg

As outlined in a blog post last month, the clampdown is four-pronged. It targets actors that impersonate others; tricks used to artificially expand the audience for a particular message; the assertion of false information; and the spreading of false narratives.

The last target is the troubling one. Facebook describes “false narratives” as “intentionally divisive headlines and language that exploit disagreements and sow conflict.” Even Facebook’s own team are unsure what this means.

“This is the most difficult area for us, as different news outlets and consumers can have completely different [views] on what an appropriate narrative is, even if they agree on the facts,” Chief Security Officer Alex Stamos said.

Stamos himself is currently seeing out the last few months of his tenure as CSO. Last year, he spoke about the dangers of filtering news in a series of tweets, warning that fake-news filtering would lead to Facebook “becoming the Ministry of Truth with machine-learning systems trained on your personal biases.”

Stamos ended his tweetstorm on an ominous note: “A lot of people aren’t thinking hard about the world they are asking Silicon Valley to build. When the gods wish to punish us they answer our prayers.”

Hate speech

On the same day as Zuckerberg’s sit-down with the mainstream media, Facebook premiered a surprise new feature. Beneath almost every public post, users could now see a notification asking “Does this post contain hate speech?” beside two buttons marked “Yes” and “No.”

Minutes later, the notification was gone. Facebook VP Guy Rosen later explained on Twitter that the feature was an internal test that had accidentally been made public.

Zuckerberg struggled to give a clear definition of hate speech when he appeared before Congress last month. “The question of what is hate speech and what is legitimate political speech is something we get criticism from both left and right,” he said. “It’s nuanced. We try to lay this out in our community standards.”

Facebook’s community standards page had been scant on details until a few weeks ago, when it was updated and expanded to include a new definition of hate speech.

According to the community standards page, hate speech is “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease. We also provide some protections for immigration status.”

If the reporting option rolled out on Tuesday is anything to go by, Facebook may be trying to supplement this definition through crowdsourcing, much as users’ trustworthiness ratings are being used to rank news sources.
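
Facebook has not said how, or whether, those “Yes”/“No” responses would be weighted. Purely as an illustration of the crowdsourcing idea, a simple vote-threshold aggregation might look like the sketch below; the function name, threshold, and minimum-vote count are assumptions, not anything Facebook has described.

# Hypothetical sketch: aggregate crowdsourced "Does this post contain
# hate speech?" answers into a review decision. Values are illustrative.

def flag_for_review(responses, min_votes=20, threshold=0.6):
    """responses: list of booleans, True meaning a user answered 'Yes'.

    Returns True when enough users have responded and the share of
    'Yes' answers crosses the threshold, i.e. the post should be
    escalated to human moderators.
    """
    if len(responses) < min_votes:
        return False  # too few responses to act on
    yes_share = sum(responses) / len(responses)
    return yes_share >= threshold

# Example: 25 responses, 18 of them 'Yes' – the post gets escalated.
print(flag_for_review([True] * 18 + [False] * 7))   # True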

News empire

Zuckerberg’s meeting with top news executives was about more than just content policing. At the meeting, he laid out his vision for the future of news on Facebook. As well as ensuring that “people can get trustworthy news on our platform,” Zuckerberg declared his intent to fund investigative journalism projects and support journalistic non-profits.

But no matter what steps Facebook takes to enter the world of journalism, many users simply don’t want to see news on Facebook at all, much less curated and filtered news. Below BuzzFeed’s article on Tuesday’s meeting, comments flowed in.

“Mark, I need you to rank NOTHING For me. I'm an adult. I can make my own decisions. Your hate speech bug did me in today. I will read a news site for my news thank you, and I will come to FB to bs with my friends and talk to people about dogs and cooking. Got to keep you in your proper perspective of relevance. And you, sir, are not a source I would ever trust for my news,” read one comment from Geli Von Der Sauk.

“I don't remember asking Mr Zuckerburg to rate the news for me. I can figure out which is stupid, fake and reasonably balanced, thanks anyway,” read another from David Harder.

Earlier this year, Facebook changed its news feed, giving users more content from friends and family and fewer news articles and videos. Zuckerberg introduced these changes to increase “meaningful interaction” on the platform and “make sure that our products are not just fun, but are good for people.”

Mere months later, the social media monolith could be gearing up for a radical about-turn on that policy.
