Facebook manipulated users' emotions as part of psychological experiment – study
Facebook conducted a psychological experiment on its users by manipulating their emotions without their knowledge, a new study reveals.
Researchers toyed with the feelings of 689,003 randomly selected English-speaking Facebook users by changing the contents of their news feeds, according to a paper published in the June edition of the journal 'Proceedings of the National Academy of Sciences' (PNAS).
During a week-long period in January 2012, researchers staged two parallel experiments, reducing the number of positive or negative updates in each user's news feed.
"When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks," said the authors of the paper, who include researchers from Facebook, Cornell University, and the University of California.
“We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days.”
The researchers say the study is the first to find that moods expressed via social networks influence the emotions of others.
“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
The Facebook users were not notified of the experiment. However, according to Facebook's terms of service (to which every person agrees when they register on the social network), users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
The researchers argue that their experiment was consistent with Facebook’s Data Use Policy.
The paper also stated that the researchers never saw the content of the actual posts; instead, they relied on software that automatically counted the occurrence of positive and negative words in more than three million status updates. Those posts contained a total of 122 million words; four million of those were positive (3.6%) and 1.8 million were negative (1.6%).
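The word-counting approach described above can be illustrated with a short sketch. This is not the study's actual tool; the lexicons below are tiny hypothetical lists standing in for the much larger dictionaries such research typically uses, and the function names are invented for illustration.

```python
# Hypothetical mini-lexicons for illustration only; the actual study
# used far larger word-count dictionaries, not these short lists.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def classify_post(text):
    """Return (has_positive, has_negative) flags for one status update.

    The classification looks only at word occurrence, so no human ever
    needs to read the post itself -- mirroring the automated approach
    the paper describes.
    """
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & POSITIVE), bool(words & NEGATIVE)

def tally(posts):
    """Count how many posts contain at least one positive word,
    and how many contain at least one negative word."""
    pos = neg = 0
    for post in posts:
        has_pos, has_neg = classify_post(post)
        pos += has_pos
        neg += has_neg
    return pos, neg
```

Run over a batch of updates, `tally` yields aggregate counts like the per-word totals reported in the paper, without exposing any individual post's content to a reader.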
The measured effect was very small: the “emotional contagion” shifted users' emotional expression by only about 0.1 percent. However, one can argue that with more than 1.3 billion Facebook users worldwide, even that small percentage still translates into a significant number of people.