In a study to see if emotional states could be transferred to others online,
Facebook conducted a psychological experiment in January 2012 with its users as guinea pigs. According to a research paper published this month, the news feeds of over 689,000 English-language accounts were altered to emphasize either positive or negative content for one week to see whether there was an impact on users' emotional states.
In the study, which was led by Facebook data scientist Adam Kramer, algorithms removed posts from a user's news feed when they contained positive or negative keywords, depending on the experimental condition. By using a machine to filter the words, the researchers avoided reading any posts or other information that could fall under the protection of privacy settings.
Both negative and positive experiments were carried out in parallel, with each selected user assigned a 10 to 90 percent chance that a matching post would be filtered out. The study notes that while the news feed was altered, all of the filtered information could still be accessed on the friend's own feed or timeline. The content of direct messages was not affected.
Words were flagged for filtering using Linguistic Inquiry and Word Count (LIWC) software. Control groups saw random omissions made without regard to emotional content, and further controls balanced the disparity between the 22.4 percent of posts containing negative words and the 46.8 percent containing positive words. The paper states that over three million posts were analyzed in the experiment.
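The mechanics described above can be illustrated with a short sketch. Note that this is a hypothetical reconstruction, not Facebook's actual code: the word lists stand in for the proprietary LIWC dictionaries, and the function names are invented for illustration.

```python
import random

# Hypothetical stand-ins for the LIWC emotion dictionaries used in the study.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def contains_emotion(post, words):
    """Return True if any token in the post matches the target word list."""
    return any(token.strip(".,!?").lower() in words for token in post.split())

def filter_feed(posts, target_words, omission_rate):
    """Omit posts containing target words with the given probability.

    omission_rate models the study's per-user 10 to 90 percent chance
    that a matching post is withheld from the news feed. Withheld posts
    remain visible on the friend's own timeline; only this user's feed
    is affected.
    """
    filtered = []
    for post in posts:
        if contains_emotion(post, target_words) and random.random() < omission_rate:
            continue  # withheld from this user's news feed
        filtered.append(post)
    return filtered

feed = ["I love this!", "What an awful day.", "Meeting at noon."]
# A user in the negativity-reduced condition with a 50 percent omission rate:
reduced = filter_feed(feed, NEGATIVE_WORDS, 0.5)
```

Matching on keywords alone, as sketched here, is what let the researchers claim no human ever read the filtered posts.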
Facebook stands behind its ability to manipulate users' feeds based on a single line in the terms of service that every user must agree to. In the paper, the researchers point to that line in Facebook's Data Use Policy. According to Facebook's page on information privacy, data can be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
"As such, it was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research," states the paper.
This raises privacy concerns, especially in light of the study's results. Facebook users have consented to data manipulation on the understanding that the company might use their data to improve its services or for research projects. But since the study, drawing on a large sample, shows that social media can manipulate one aspect of a person by controlling the information they see, it follows that the flow of information has a profound effect on emotional states and thought patterns.
While data can be manipulated to alter emotions, there is a larger point at hand: what is said on the Internet does affect others. The study found that emotional contagion could occur without face-to-face interaction or nonverbal cues. In fact, text alone is sufficient to evoke the emotional shift.
A withdrawal effect was also noticed during the course of the study: those seeing fewer emotional posts were less expressive in the days that followed. Results indicate that people seeing more positive items in their news feeds posted more positive words, while those seeing more negative items posted more negative words in turn.
"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks," states the paper. "This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others' positive experiences constitutes a positive experience for people."
The paper on the study can be found on the Proceedings of the National Academy of Sciences website.