The UK’s privacy watchdog is to investigate whether Facebook broke data protection laws by conducting a psychological experiment on users without their consent.
Two US universities were involved in the experiment, conducted over one week in January 2012, in which Facebook filtered 689,000 users’ newsfeeds to study the effects on their emotions.
One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own.
Another test reduced exposure to "negative emotional content", resulting in fewer negative posts by those selected for the test.
The publication of the report drew criticism, mainly of the way the research was conducted – particularly the lack of consent – and raised concerns over the potential impact of such studies.
The UK’s Information Commissioner's Office (ICO) now plans to question Facebook over the study, according to the Financial Times.
The paper quoted an ICO spokesman as saying it was too early to tell exactly what part of the law Facebook may have infringed.
The ICO said it would contact the data protection regulator in Ireland, because Facebook has its European headquarters in Dublin.
Consent for thought-control experiment
Labour MP Jim Sheridan, a member of the Commons media select committee, has called for a parliamentary investigation into how Facebook and other social networks manipulate the emotional and psychological responses of users by editing the information supplied to them.
"If people are being thought-controlled in this kind of way, there needs to be protection and they at least need to know about it," Sheridan told the Guardian.
Facebook has defended the experiment by saying there was "no unnecessary collection of people's data" and that "none of the data used was associated with a specific person's Facebook account".
Richard Allan, Facebook’s director of policy in Europe, said in a statement that the social networking firm was happy to answer any of the regulators' questions.
However, he said it was clear that people were upset by the study. "We want to do better in the future and are improving our process based on this feedback."
Facebook data scientist Adam Kramer – who co-authored the report on the research – said the social networking firm felt it was important to investigate concerns that seeing friends post positive content leads to people feeling negative or left out.
Facebook was also concerned that exposure to friends' negativity might lead people to avoid visiting the site, he said, but admitted that the firm did not "clearly state our motivations in the paper".
"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused," said Kramer.
Retrospective terms and conditions
In May 2012, Facebook changed its terms and conditions to allow users' data to be used for research – but the change came about four months after the firm conducted the mood-influencing experiment, the Guardian reports.
Facebook added a clause granting the firm the right to use information about its customers "for internal operations, including… research".
But the version of the document in force in January 2012, when the psychology experiment was conducted, did not include research.
The change was made after Facebook settled a complaint from the Federal Trade Commission (FTC), which had accused the firm of "unfair and deceptive" privacy practices.
In a statement to the Guardian, the social networking firm said: "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer. To suggest we conducted any corporate research without permission is complete fiction."