Facebook under fire over secret experiment on users

Facebook has come under fire for conducting a psychology experiment on 689,000 users without their consent.

Cornell University and the University of California were involved in the experiment, conducted over one week in 2012, in which Facebook filtered users' news feeds to study the effect on their emotions.

One test reduced users' exposure to their friends' "positive emotional content", which resulted in those users making fewer positive posts of their own.

Another test reduced exposure to "negative emotional content", resulting in fewer negative posts by those selected for the test.

The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

Facebook has defended the experiment by saying there was "no unnecessary collection of people's data" and that "none of the data used was associated with a specific person's Facebook account."

But publication of the study has unleashed criticism, mainly of the way the research was conducted, and raised concerns over the impact such studies could have, the BBC reports.

The study has also raised fears that the process could be used for political purposes or to boost social media advertising revenues.

Labour MP Jim Sheridan, a member of the Commons media select committee, has called for a parliamentary investigation into how Facebook and other social networks manipulate the emotional and psychological responses of users by editing the information supplied to them.

"This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he told the Guardian.

"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas.

"If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it,” said Sheridan.

According to Facebook data scientist Adam Kramer, who co-authored the report on the research, the social networking firm felt that it was important to investigate concerns that seeing friends post positive content leads to people feeling negative or left out.

Facebook was also concerned that exposure to friends' negativity might lead people to avoid visiting the site, but Kramer admitted that the firm did not "clearly state our motivations in the paper".

"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he said in a statement.

The study said altering the news feeds was "consistent with Facebook's data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".

While it is not new for internet firms to use algorithms to select the content shown to users, social media commentator Jacob Silverman said it is disturbing that Facebook essentially manipulated the sentiments of hundreds of thousands of users without asking permission.

"Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there's little reason to think that they won't do just that,” he told Wired magazine.

"As long as the platform remains such an important gatekeeper – and their algorithms utterly opaque – we should be wary about the amount of power and trust we delegate to it," said Silverman.
