
Privacy group files complaint with FTC over Facebook emotional study
NewsPoster
MacNN Staff
Join Date: Jul 2012
Jul 6, 2014, 03:13 PM
 
If Facebook hasn't received enough flak for the emotional manipulation study it conducted on its user base, the company could soon face more from regulators. Last week, privacy watchdog group the Electronic Privacy Information Center (EPIC) filed a complaint with the Federal Trade Commission (FTC) over the one-week study Facebook conducted in 2012 that manipulated users' news feeds.

In EPIC's filing to the FTC, the group asserts that Facebook "purposefully messed with people's minds" while failing to "follow standard ethical protocols for human subject research." The group believes that Facebook ran afoul of several guidelines, including an FTC consent order and a failure to inform users.

"At the time of the experiment, Facebook did not state in the data-use policy that user data would be used for [psychological] research purposes," said EPIC in the complaint. "Facebook also failed to inform users that their personal information would be shared with researchers. Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties."

As part of the requested relief, EPIC asks that the FTC launch an investigation into the "unlawful manipulation" of the news feed and the transfer of data to third parties in violation of the 2012 consent order. EPIC asks that the FTC actually enforce the consent order it put in place, as well as compel Facebook to make public the algorithm used on the news feed.

The full body of the complaint can be viewed at the EPIC website.

EPIC has seen success with past filings against the social media company. A filing from 2009 and a follow-up in 2010 led to a settlement between the FTC and Facebook, which resulted in the 2012 consent order. That complaint concerned Facebook sharing information it had previously said was private.

Facebook is already facing an inquiry into the emotional study in the United Kingdom from the Information Commissioner's Office. The office is looking into the research to see if it broke any laws, including those around the age of the users selected for the study.

Since the research paper on "evidence of massive-scale emotional contagion through social networks" was posted earlier this month, statements have emerged regarding how the study should have been handled. Adam Kramer, one of the paper's coauthors, took to Facebook to explain the significance of the study and acknowledge that it could have been handled better. Facebook's Chief Operating Officer Sheryl Sandberg emphasized that the study was "poorly communicated" and said the company was remorseful. However, Facebook has yet to issue any formal apology to the users subjected to the research.

The study manipulated the news feeds of over 600,000 Facebook accounts, favoring either positive or negative posts based on their word content. While the filtering was performed by an algorithm rather than by people reading posts, many are unhappy that they were subjected to the emotional manipulation the study proved could occur over the Internet. Left unresolved is whether those subjected to a steady diet of negative news were emotionally harmed, or had conditions such as depression worsened by the experiment.

As the fallout from the research faux pas continues, more information has come to light regarding the line in Facebook's data-use policy that the company uses to justify the study. According to Forbes writer Kashmir Hill, the line indicating that information could be used for research wasn't added until May 2012, three months after the study. Facebook posted a document outlining the changes, which were based on the policy then in effect, dating from September 2011; the research line is one of the "redline" items in the policy changes. In its FTC filing, EPIC confirms the September 2011 policy was the one in force at the time of the experiment.
( Last edited by NewsPoster; Jul 8, 2014 at 02:56 AM. )
     
FireWire
Mac Elite
Join Date: Oct 1999
Location: Montréal, Québec (Canada)
Jul 6, 2014, 05:29 PM
 
come on! people are a bunch of assholes.. people get studied all the time. you're using a service for free, and you're already giving all your life details away to facebook anyway, don't complain if they use your data for research!
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Jul 6, 2014, 06:37 PM
 
Plus, not to mention, when you clicked through the FaceBook TOS, you agreed to stuff like this anyhow.
     
DiabloConQueso
Grizzled Veteran
Join Date: Jun 2008
Jul 6, 2014, 07:24 PM
 
I think it stems from the fact that people have a very different definition of "privacy," and of who, precisely, is responsible for protecting it, than they did 10 or 20 years ago.

It used to be that if you didn't want someone to know something, you kept your mouth shut. These days, it seems to be that if you don't want someone to know something, you tell it to people anyway and expect them to keep their mouth shut.

People seem to overlook the fact that the biggest invader of their privacy is their own selves.
     
elroth
Forum Regular
Join Date: Jul 2006
Jul 7, 2014, 01:42 AM
 
This is much more than just giving out your personal information (as if that isn't bad enough). Facebook changed people's news feeds, to psychologically manipulate the people who received them. The study showed that people who receive more negative news feeds subsequently post more negative comments. Could this add to some people's depression? It's horrible for Facebook to do this - their stated purpose was to psychologically manipulate people.
     
Charles Martin
Mac Elite
Join Date: Aug 2001
Location: Maitland, FL
Jul 7, 2014, 03:06 AM
 
There is a very real difference between consenting to have what you do on FB collected for research (which every FB user agreed to) and consenting to have Facebook *manipulate you mentally* and then study the results (which no FB user has ever agreed to, and is pretty clearly in violation of the FTC decree). They crossed a huge bright line IMO and are very likely to get hit pretty hard for this. It's very reflective of the attitude FB has towards its users ("lab rats") and the attitude it has about the FTC agreement. I predict a lot of successful legal action over this.
Charles Martin
MacNN Editor
     
Mike Wuerthele
Managing Editor
Join Date: Jul 2012
Jul 7, 2014, 11:11 AM
 
Originally Posted by chas_m View Post
There is a very real difference between consenting to have what you do on FB collected for research (which every FB user agreed to) and consenting to have Facebook *manipulate you mentally* and then study the results (which no FB user has ever agreed to, and is pretty clearly in violation of the FTC decree). They crossed a huge bright line IMO and are very likely to get hit pretty hard for this. It's very reflective of the attitude FB has towards its users ("lab rats") and the attitude it has about the FTC agreement. I predict a lot of successful legal action over this.
How is a headline or the tone of an article not "manipulating you mentally?" It's not like there's FaceBook orbital mind control lasers operating in conjunction with the experiment.
     
Flying Meat
Senior User
Join Date: Jan 2007
Location: SF
Jul 7, 2014, 04:02 PM
 
Uh, because that wasn't the content as it was originally to be delivered. It was altered. Altered by Facebook. So the content is less likely to be interpreted as it was intended. Instead of "It certainly could be worse." we have "It certainly couldn't be worse."
Communication is hard enough, without someone actually changing what you said.
There is no reason to ever expect Facebook to do something like that except maybe in cases of excessive swearing, or explicit sexual dialog. There literally is no good reason for them to have done this.
     
   
All contents of these forums © 1995-2017 MacNN. All rights reserved.