
‘Weaponizing its terms of service’: Facebook FORCED shutdown of project monitoring Instagram’s algorithm, German researcher says
German social-media researcher AlgorithmWatch said it was forced to abandon a project monitoring Instagram’s newsfeed algorithm after the Big Tech platform’s parent, Facebook, threatened to take punitive action.

“Ultimately, an organization the size of AlgorithmWatch cannot risk going to court against a company valued at $1 trillion,” the German not-for-profit group said on Friday. The project was halted last month, and AlgorithmWatch said it chose to speak out now – highlighting the need for lawmakers to protect future research of online platforms – after Facebook shut down the accounts of researchers at New York University’s Ad Observatory.

The group’s project involved recruiting 1,500 volunteers to install a browser extension that scraped data from their Instagram newsfeeds. In the project’s first 14 months, AlgorithmWatch made such findings as: Instagram encouraged content creators to post certain types of pictures, such as shirtless men or women wearing bikinis or underwear; and politicians were more likely to reach a larger audience if they refrained from using text in their posts.

But when the group asked Facebook for comment on its findings, the tech giant said the project was “flawed in a number of ways,” and it later said it had found shortcomings in the study’s methodology. Facebook didn’t spell out what those flaws and shortcomings were, AlgorithmWatch said, and in May 2021 the company claimed the group had violated its terms of service.

Those terms prohibit automated collection of data, but the German group said it had only gathered content from volunteers who’d installed its browser extension. “In other words, users of the plug-in were only accessing their own feed and sharing it with us for research purposes,” AlgorithmWatch said.

Nevertheless, the group said Facebook made a “thinly veiled threat,” saying it would “move to more formal engagement if we did not resolve the issue on their terms.” About six weeks later, AlgorithmWatch closed the project and deleted its data.

The group accused Facebook of “bullying” and of “weaponizing” its terms of service. “Facebook’s reaction shows that any organization that attempts to shed light on one of their algorithms is under constant threat of being sued,” AlgorithmWatch said. “Given that Facebook’s terms of service can be updated at their discretion with 30 days’ notice, the company could forbid any ongoing analysis that aims at increasing transparency.”


A Facebook spokesperson told the Daily Caller that Facebook didn’t threaten to sue AlgorithmWatch, and it tried to work with the researcher on ways to continue the project without violating user privacy. “We believe in independent research into our platform and have worked hard to allow many groups to do it, including AlgorithmWatch, but just not at the expense of anyone’s privacy,” the Facebook spokesperson said. 

One of the concerns Facebook raised was that the German project had collected some data from Instagram users who weren’t volunteer participants. AlgorithmWatch countered that any data from outside its participant group was deleted immediately when it arrived at the group’s server.

The group said it’s urgent to shed light on Instagram’s algorithms, citing such apparent manipulation of public opinion as the disappearance of posts about protests in Colombia and the removal of certain types of Palestinian content. Many users have been shadow-banned, meaning their posts are published but hidden from other users’ view.


“Without independent public-interest research and rigorous controls from regulators, it is impossible to know whether Instagram’s algorithms favor specific political opinions over others,” AlgorithmWatch said.

The group added that major social-media platforms play an “oversized” and “largely unknown” role in shaping public opinion, including voting choices, and transparency is needed to hold them accountable. “Only if we understand how our public sphere is influenced by their algorithmic choices can we take measures toward ensuring they do not undermine individuals’ autonomy, freedom and the collective good,” AlgorithmWatch said.
