Facebook 'echo chamber' makes people more narrow-minded – study
The study, published in the journal Proceedings of the National Academy of Sciences, examined data on topics people discussed on the social network between 2010 and 2014. The information was categorized into three groups: science news, conspiracy rumors, and trolling.
Despite the world being at their fingertips, the study found that users “tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization.”
“This comes at the expense of the quality of information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia,” wrote the authors.
The researchers, from Boston University and various Italian institutions, referred to this phenomenon as an “echo chamber,” in which a network of like-minded people share controversial theories, biased views, and selective news. That information is then simply repeated back to them and accepted as fact.
“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest,” the authors said.
“Whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by how much it coheres with the user's system of beliefs,” they added.
The authors noted the “echo chamber” may be the reason certain phenomena have become widespread, such as the rejection of global warming evidence. They went on to state that while scientific information can often be traced, the origins of conspiracy theories are difficult to identify.
The study also noted that the spread of unreliable “viral” information online has become so serious that the World Economic Forum has classified it as one of the biggest threats to society.
“Massive digital misinformation is becoming pervasive in online social media to the extent that it has been listed by the World Economic Forum as one of the main threats to our society,” the authors wrote.