Facebook under fire over secret experiment on users

Facebook has come under fire for conducting a psychology experiment on 689,000 users without their consent.

Cornell University and the University of California were involved in the experiment, conducted over one week in 2012, in which Facebook filtered users' news feeds to study the effect on their emotions.

One test reduced users' exposure to their friends' "positive emotional content", resulting in those users making fewer positive posts of their own.

Another test reduced exposure to "negative emotional content", resulting in fewer negative posts by those selected for the test.

The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."
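The manipulation itself was mechanically simple: according to the paper, a post counted as positive or negative if it contained at least one word from the corresponding LIWC word list, and matching posts were then omitted from a user's news feed with some probability. A minimal Python sketch of that idea, purely illustrative, with toy word lists and a made-up filter_feed helper standing in for whatever Facebook's production systems actually did:

```python
import random

# Toy word list standing in for the LIWC dictionary the study used;
# the real lexicons contain thousands of terms.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def is_positive(post: str) -> bool:
    """A post counts as positive if it contains at least one positive word."""
    return bool(set(post.lower().split()) & POSITIVE_WORDS)

def filter_feed(posts: list[str], omit_probability: float, user_seed: int) -> list[str]:
    """Return a feed with positive posts omitted at the given probability.

    `user_seed` is a stand-in for the per-user value the experiment
    reportedly derived from the user ID, so that each user saw a
    consistently filtered feed across visits.
    """
    rng = random.Random(user_seed)
    return [p for p in posts
            if not (is_positive(p) and rng.random() < omit_probability)]

feed = ["What a wonderful day", "Stuck in traffic again", "I love this song"]
print(filter_feed(feed, omit_probability=0.5, user_seed=42))
```

The negative-content condition is the mirror image, omitting posts that match a negative word list instead.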

Facebook has defended the experiment by saying there was "no unnecessary collection of people's data" and that "none of the data used was associated with a specific person's Facebook account."

But publication of the study has unleashed criticism, mainly of the way the research was conducted, and has raised concerns over the impact such studies could have, the BBC reports.

The study has also raised fears that the process could be used for political purposes or to boost social media advertising revenues.

Labour MP Jim Sheridan, a member of the Commons media select committee, has called for a parliamentary investigation into how Facebook and other social networks manipulate the emotional and psychological responses of users by editing the information supplied to them.

"This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he told the Guardian.

"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas.

"If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it,” said Sheridan.

According to Facebook data scientist Adam Kramer, who co-authored the report on the research, the social networking firm felt that it was important to investigate concerns that seeing friends post positive content leads to people feeling negative or left out.

Facebook was also concerned that exposure to friends' negativity might lead people to avoid visiting the site, but Kramer admitted that the firm did not "clearly state our motivations in the paper".

"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he said in a statement.

The study said altering the news feeds was "consistent with Facebook's data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".

While it is not new for internet firms to use algorithms to select content to show to users, social media commentator Jacob Silverman said it is disturbing that Facebook essentially manipulated the sentiments of hundreds of thousands of users without asking permission.

"Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there's little reason to think that they won't do just that,” he told Wired magazine.

“As long as the platform remains such an important gatekeeper – and their algorithms utterly opaque – we should be wary about the amount of power and trust we delegate to it,” said Silverman.
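Silverman's scenario is easy to express in code, which is part of what makes it unsettling. A hypothetical sketch, with every name invented for illustration (engagement_score stands in for a click-prediction model, is_negative for a sentiment classifier; none of this reflects Facebook's real ranking system):

```python
def rank_feed(posts, engagement_score, is_negative, negativity_penalty=0.5):
    """Order posts by predicted engagement, silently down-weighting negative ones."""
    def score(post):
        s = engagement_score(post)
        if is_negative(post):
            s *= negativity_penalty  # the opaque editorial thumb on the scale
        return s
    return sorted(posts, key=score, reverse=True)

posts = ["Feeling great today!", "This commute is terrible", "New job, so excited"]
print(rank_feed(
    posts,
    engagement_score=lambda p: len(p),              # stand-in for a click model
    is_negative=lambda p: "terrible" in p.lower(),  # stand-in for a sentiment model
))
```

A few lines like these, tuned against engagement metrics and never disclosed, are all it would take to act on the incentive Silverman describes.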


3 comments


This really is an outrageous misuse of personal information, whether anonymised or not, and whether within Facebook's data use policy or not. In fact, the data use policy must be challenged because of this. Facebook cannot be trusted. This is nothing short of mind manipulation, and I fully endorse and support Jim Sheridan MP in his call for an investigation and for legislation to prevent this sort of corporate stupidity. As an information security consultant, I am constantly advising people to be careful what they put on social media sites, but this is a whole new ball game and must stop. We've had the Snowden leaks, and we know that trust in governments, East, West and everywhere in between, has been lost. This, though, takes manipulation of the masses one step closer and must be challenged vigorously.


Facebook is the enemy of humans. It is a spying machine, creates antisocial behavior, and promotes oppression. Why are people stupid enough to use this garbage system?


Well, our government already has the power, so they will try to regulate any other form of social media trying to use it. Look, we all posted here, and no doubt we are being monitored :)

