Facebook argues, however, that this was covered when users signed up for an account

Jul 1, 2014 07:27 GMT

Nearly 700,000 people were secretly involved in a Facebook experiment a couple of years ago, but the social network only modified its data use policy to include “research” four months after the study took place.

Facebook’s experiment took place in January 2012, but the data use policy that the company says covers the study was only updated in May 2012. In the newly added text, Facebook mentioned that data might be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

As many companies do when they update their policies, Facebook even created a redline version contrasting the two documents, so people could see exactly what information was removed and what was added. Comparing the May 2012 data use policy with the September 2011 version makes it clear that research, testing, and data analysis were not previously included.

A Facebook spokesperson said that upon signing up for Facebook, people are asked for permission to use their information to enhance the service. “To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word ‘research’ or not,” Facebook said about the issue.

Adam Kramer, the Facebook data scientist who worked on the experiment in which people’s emotions were manipulated, explained after the outcry that the company wanted to investigate whether seeing friends post positive content left people feeling negative or left out.

However, he did add that the company was also concerned that exposure to friends’ negativity might lead people to avoid Facebook, which is actually the purpose of all this. Basically, the social network wanted to find a way to keep people visiting the site even when their friends were going through rough times.

In case you haven’t heard yet, Facebook ran an experiment back in January 2012 in which, for an entire week, nearly 700,000 people’s feeds were manipulated to display a certain type of content. Posts from their friends were scanned for various words, and an algorithm decided whether each post was positive or negative.

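To illustrate how this kind of word-list labeling can work, here is a minimal Python sketch. The word lists, the tie-breaking rule, and the function names are hypothetical illustrations, not Facebook’s actual classifier, which the article does not detail.

    # Minimal sketch of word-list sentiment labeling. The word lists and
    # thresholds are illustrative assumptions only.
    POSITIVE_WORDS = {"happy", "great", "love", "awesome", "wonderful"}
    NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

    def classify_post(text: str) -> str:
        """Label a post by counting matches against each word list."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        pos = sum(1 for w in words if w in POSITIVE_WORDS)
        neg = sum(1 for w in words if w in NEGATIVE_WORDS)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    # Hypothetical feed-filtering pass for one test group: keep only posts
    # carrying the chosen label.
    def filter_feed(posts, keep="positive"):
        return [p for p in posts if classify_post(p) == keep]

    print(classify_post("Had a great day, love this weather"))  # -> positive
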
Part of the group saw considerably more positive posts in their news feeds, while the others saw the negative ones. Facebook then analyzed the way the subjects interacted with the platform and found that emotional contagion was a real phenomenon on the social network.