In a public Facebook post, Adam Kramer, one of the three Facebook data scientists who published a controversial emotional contagion study carried out on the social network, responded to the criticism their research has received.
The study has been met with backlash because the researchers manipulated the moods of about 700,000 Facebook users without their knowledge.
The research, entitled "Experimental evidence of massive-scale emotional contagion through social networks," sought to show that emotional contagion, the phenomenon in which one person's emotions directly affect another's, can also occur through Facebook.
The experiment succeeded, showing that emotional contagion can indeed be replicated on the social network: users who read more negative posts in their News Feed would in turn post more negative status updates, and vice versa.
The criticism, however, is aimed not at the purpose or the results of the study, but at the way the research was carried out.
To carry out the experiment and collect data, the team of data scientists manipulated which status updates showed up in the News Feeds of the selected users. According to the team, Facebook's data use policy permits the experiment, as a clause in the policy states that user data may be used for "internal operations, including troubleshooting, data analysis, testing, research and service improvement."
"Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012)," Kramer said in his Facebook post, defending their study.
Kramer added that no posts were "hidden" from the users; the affected posts simply did not appear on certain loads of the News Feed. They remained viewable on the timelines of the friends who posted them, and may also have shown up on subsequent loads of the News Feed.
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," Kramer apologizes, but then adds that the benefits obtained from the research do not justify all the anxiety that it has caused.
Criticism of the study stems from the fact that users did not want to be subjected to an experiment without giving explicit prior consent, even though they technically agreed to such use when they accepted Facebook's terms of service. In addition, users found it unsettling that the social network was willing to manipulate their emotions, even making them feel more negative, for the sake of research.