By James Hirsen, Special for USDR
Over the past weekend, an astonishing bit of information surfaced concerning social media giant Facebook.
In early June of 2014, a research study conducted by Facebook researchers along with academics from Cornell University and the University of California, San Francisco, appeared in a prominent academic journal, the Proceedings of the National Academy of Sciences.
The journal article described researchers’ attempts to evaluate users’ emotional responses on Facebook by modifying their news feeds. The company was apparently experimenting to find out whether users could be made to undergo certain emotional changes through what the study referred to as “emotional contagion.”
According to the journal article, two university researchers, working in conjunction with a Facebook data scientist, altered the news feeds of 689,003 users. The research, which modified the algorithm used to place posts into users’ news feeds, was conducted in 2012 over a period of approximately one week.
The purported purpose of the study was to determine whether the emotional state of users engaged in social networking activity is “contagious.” In other words, can an emotional state be transmitted to other online participants? Does positive content elicit more positive content in response, and conversely, does negative content elicit more negative content in response?
As the social media story continues to unfold, it is important to note that the intended purpose of the research study may differ from the purported one. The intended purpose may actually have been to determine whether Facebook could instill in users’ minds feelings of happiness, sadness, or any variety of other emotions, for reasons or purposes not currently revealed.
Evidently, without giving users prior notice or first obtaining the consent of the research participants, Facebook altered which of the comments, links, pictures, and videos posted by other users within an individual’s social network circle would appear in what Facebook refers to as the “news feed.”
The experiment reduced the “positive emotional content” of the news feed and monitored the subsequent posts of the affected users. In another variation, Facebook cut down the “negative emotional content” to determine whether users’ emotions would be affected.
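To make the mechanics the study describes more concrete, the sketch below is a hypothetical Python illustration, not Facebook’s actual code: the keyword lists, function names, and omission rate are all assumptions standing in for the automated text classification the researchers used. It shows how a feed algorithm might probabilistically withhold posts classified as positive or negative for users in an experimental condition, and how the emotional tone of those users’ own subsequent posts could then be measured.

```python
import random

# Hypothetical word lists standing in for a real sentiment classifier
# (the actual study relied on automated text classification of posts).
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate"}

def classify(post: str) -> str:
    """Label a post as positive, negative, or neutral by simple keyword match."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def build_feed(candidate_posts, condition, omission_rate=0.5, rng=random):
    """Assemble a news feed, probabilistically dropping posts whose emotional
    tone matches the experimental condition ("reduce_positive" or "reduce_negative")."""
    feed = []
    for post in candidate_posts:
        tone = classify(post)
        if condition == "reduce_positive" and tone == "positive" and rng.random() < omission_rate:
            continue  # withhold this positive post from the user's feed
        if condition == "reduce_negative" and tone == "negative" and rng.random() < omission_rate:
            continue  # withhold this negative post from the user's feed
        feed.append(post)
    return feed

def emotional_tone(user_posts):
    """Measure the share of a user's own posts that are positive vs. negative."""
    labels = [classify(p) for p in user_posts]
    total = len(labels) or 1
    return {
        "positive": labels.count("positive") / total,
        "negative": labels.count("negative") / total,
    }
```

In the study’s terms, “emotional contagion” would show up as users in the reduced-positivity condition going on to post measurably fewer positive and more negative words of their own, and the reverse for the reduced-negativity condition.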
“Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks,” the authors of the study concluded.
When it was discovered that the study had been conducted and data had been collected unbeknownst to Facebook participants, word quickly traveled throughout cyberspace. As a result, social media sites were replete with emotion of a possibly unanticipated and unintended kind. An intense social media anger was brewing over the fact that the clandestine research had taken place at all.
Facebook has responded to the negative reaction to the study via a post by one of its data scientists, Adam Kramer, who co-authored the research. Kramer wrote on his Facebook page that the company conducted the secret study for the benefit of those who use the social media platform.
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook… In hindsight, the research benefits of the paper may not have justified all of this anxiety,” the post read.
The truth of the matter is that for academic research to be considered ethical or legitimate, participants must be informed about the study and agree to participate, or at least have the ability to opt out without incurring any penalty. Manipulating the sentiments of hundreds of thousands of users without their permission is clearly outside those ethical boundaries.
Facebook, according to the language in the study, justified the stealth research on unsuspecting users by relying on the social media site’s “Data Use Policy,” to which all users must agree prior to creating an account. Most users accede reflexively, without having thoroughly read the policy or gained a clear understanding of its implications. Unfortunately for some of the irate Facebook users, buried in the boilerplate of the agreement is language stating that users’ data may be used for “research” purposes.
The Facebook study, however, still raises a number of ethical questions, and legal issues present themselves as well. Facebook did not obtain informed consent as outlined by U.S. federal policy for the protection of human subjects participating in research studies. The social media company failed to disclose the purpose of the study and the attendant risks involved. It also neglected to provide details concerning the duration of the research and to make clear that participation would be voluntary.
On a side note, the Facebook site’s covert research received a portion of its financing from a government source, the Army Research Office, according to a Cornell profile of one of the academic researchers.