If Facebook hasn't received enough flak for the emotional manipulation study it conducted on its user base, the company could soon face more from regulators. Last week, privacy watchdog group the Electronic Privacy Information Center (EPIC) filed a complaint with the Federal Trade Commission (FTC) over the one-week study Facebook conducted in 2012 that manipulated users' news feeds.
In EPIC's filing to the FTC, the group asserts that Facebook "purposefully messed with people's minds" while failing to "follow standard ethical protocols for human subject research." The group believes that Facebook ran afoul of several guidelines, including an FTC consent order and a failure to inform users.
"At the time of the experiment, Facebook did not state in the data-use policy that user data would be used for [psychological] research purposes," said EPIC in the complaint. "Facebook also failed to inform users that their personal information would be shared with researchers. Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties."
As part of the relief, EPIC requests that the FTC launch an investigation into the "unlawful manipulation" of the news feed and the transfer of data to third parties that violated a 2012 consent order. EPIC asks that the FTC actually enforce the consent order it put into place, as well as require Facebook to make public the algorithm used on the news feed.
The full body of the complaint can be viewed at the EPIC website.
EPIC has seen success in past filings against the social media company. A filing from 2009 and a follow-up in 2010 led to a settlement between the FTC and Facebook, which resulted in the 2012 consent order. That complaint concerned Facebook's sharing of information it had previously said would be kept private.
Facebook is already facing an inquiry into the emotional study in the United Kingdom from the Information Commissioner's Office. The office is looking into the research to see if it broke any laws, including those around the age of the users selected for the study.
Since the research paper on "evidence of massive-scale emotional contagion through social networks" was posted earlier this month, statements have come out regarding how the study should have been handled. Adam Kramer, one of the paper's coauthors, took to Facebook to explain the significance of the study and acknowledge it could have been handled better. Facebook's Chief Operating Officer Sheryl Sandberg emphasized that it was "poorly communicated" and claimed that the company was remorseful. However, Facebook has yet to issue any formal apology to the users subjected to the research.
The study manipulated the news feeds of over 600,000 Facebook accounts, favoring either positive or negative posts based on their word content. While machines used an algorithm to filter the results, many are unhappy that they were subjected to the emotional manipulation the study proved could occur over the Internet. Left unresolved was whether those subjected to a steady diet of negative news were emotionally damaged, or had conditions such as depression made worse by the experiment.
As the issue of the research faux pas continues, more information has come to light regarding the line in Facebook's data-use policy used to justify the study. According to Forbes writer Kashmir Hill, the line indicating that information could be used for research wasn't added until three months after the study, in May 2012. Facebook posted a document outlining the changes, which were based on the policy in effect at the time, dated September 2011. The research line is one of the "redline" items in the policy changes, and EPIC confirms in its FTC filing that the September 2011 policy was the one in place during the experiment.