Is political pressure behind YouTube’s video labeling?
Senator Mark Warner (D-Virginia) slammed the video service for “optimizing for outrageous, salacious and often fraudulent content” and encouraging “manipulation by bad actors, including foreign intelligence entities,” after the UK-based Guardian published a story about YouTube’s recommendation algorithm.
The February 2 article argued that YouTube’s algorithms “distort truth” by providing users with personalized video recommendations based on their viewing history. That code determines which videos are promoted in the “Up next” box.
YouTube/Google is now apparently labeling all state-funded media as such. Interesting move in response to pressure from the US government to crackdown specifically on Russian media. pic.twitter.com/S8I24XRRGB — Dan Cohen (@dancohen3000) February 3, 2018
“Companies like YouTube have immense power and influence in shaping the media and content that users see,” Warner told the publication. “I’ve been increasingly concerned that the recommendation engine algorithms behind platforms like YouTube are, at best, intrinsically flawed in optimizing for outrageous, salacious and often fraudulent content.”
The newspaper argued that this amounted to a digital threat to representative democracy. Its investigation, however, was largely based on research by former Google and Microsoft employee Guillaume Chaslot.
“On the eve of the US Presidential Election… more than 80% of recommended videos were favorable to Trump, whether the initial query was ‘Trump’ or ‘Clinton’,” Chaslot wrote.
I'm proud to have contributed to shedding light on how fake news outperformed reality on YouTube during the 2016 presidential campaign. We need to take back control of our information. First step: transparency. https://t.co/SUWTJP9e0F — Guillaume Chaslot (@GChaslot) February 2, 2018
Chaslot was fired by Google in 2013, “ostensibly over performance issues,” as the Guardian describes it, adding that Chaslot himself insists he was “let go after agitating for change” at the company.
Google initially dismissed the findings of Chaslot’s research, saying it “strongly disagreed” with its methodology, data and conclusions.
“It appears as if the Guardian is attempting to shoehorn research, data and their conclusions into a common narrative about the role of technology in last year’s election,” the paper quoted a Google spokesperson as saying. “The reality of how our systems work, however, simply doesn’t support this premise.”
Following last week’s publication of social media companies’ responses to the Senate Intelligence Committee, however, Google changed its tune.
“We appreciate the Guardian’s work to shine a spotlight on this challenging issue,” Google said. “We know there is more to do here and we’re looking forward to making more announcements in the months ahead.”
Warner is one of the most outspoken advocates of blaming Russia for the 2016 electoral defeat of Hillary Clinton, raking Twitter, Facebook and YouTube executives over the coals at the October 2017 hearing for allowing “Russian” content on their platforms.
“Russian operatives are attempting to infiltrate and manipulate social media to hijack the national conversation and to make Americans angry, to set us against ourselves, and to undermine our democracy,” Warner declared at the time.
Whether this is Warner and the Guardian’s doing or not, YouTube users in the US now get a warning on videos generated by government-backed creators. Sometimes that leads to hilarious consequences, such as when the Venezuela-based Telesur gets described as backed by “the Latin American government.” Telesur’s backers are the governments of Venezuela, Cuba, Ecuador, Nicaragua, Uruguay and Bolivia.
On all videos on TeleSUR's Spanish- and English-language channels, YouTube wrote a disclaimer that says it is funded "by the Latin American government." Because you know, there's only one government in Latin America pic.twitter.com/AWelEWgqK0 — Ben Norton (@BenjaminNorton) February 5, 2018