Facebook apologizes after secret psychological experiments caused outrage among users
Facebook was abuzz with complaints over the weekend after it was revealed that the social networking site secretly manipulated the posts being seen by nearly 700,000 users in early 2012 in order to let data analysts get a glimpse at how emotional states are transmitted over the platform.
During a brief period two years ago, Facebook altered the content that showed up on certain users’ news feeds to control the proportion of posts containing positively or negatively charged emotional words. By looking at a week’s worth of data, researchers at Facebook found that “emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
News of the study quickly caused users of the site to become outraged, prompting one of the authors involved in the report to take to his own Facebook profile on Sunday to offer an explanation of sorts.
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” Adam Kramer wrote. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.”
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Kramer added.
“While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.”
Even so, some critics of the study say that Facebook’s social media manipulation represents only a sliver of the problematic privacy practices its competitors engage in on a regular basis.
“It will make people a little bit nervous for a couple of days,” University of Texas psychology professor James Pennebaker told Bloomberg News this week. “The fact is, Google knows everything about us, Amazon knows a huge amount about us. It’s stunning how much all of these big companies know. If one is paranoid, it creeps them out.”
Kramer’s study — co-authored by Cornell’s Jeff Hancock and Jamie Guillory — was published in the latest edition of the journal ‘Proceedings of the National Academy of Sciences,’ and concluded that “emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”