The company says the test was meant to help improve the service

Jun 30, 2014 07:37 GMT

How much does Facebook care about your feelings? Well, the answer is both “very much” and “not at all.” That’s because the world’s largest social network has betrayed users’ trust by secretly manipulating the news feeds of nearly 700,000 people to study “emotional contagion.”

Back in 2012, for an entire week and without warning users or getting their consent, Facebook fiddled with the algorithm that places posts into the news feed to see how the changes affected users’ moods.

Basically, the social network counted the positive and negative words in each post, reduced the number of emotionally positive or negative posts shown to some users, and then checked whether those users’ own posts became more positive or negative as a result.
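To make that mechanism concrete, here is a minimal Python sketch of word-count classification and feed filtering. The word lists, threshold, and omission probability below are invented for illustration; the published paper says posts were actually classified with the LIWC2007 dictionaries, and omission rates varied per user.

```python
import random

# Toy word lists -- purely illustrative; the study used the LIWC2007 dictionaries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting emotion words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress="positive", omit_prob=0.5, seed=None):
    """Drop some posts carrying the suppressed emotion from a feed.

    The flat 50% omission probability is a placeholder; in the experiment
    each emotional post reportedly had between a 10% and 90% chance of
    being omitted, depending on the user.
    """
    rng = random.Random(seed)
    return [p for p in posts
            if classify_post(p) != suppress or rng.random() >= omit_prob]
```

Comparing the emotional tone of a user’s subsequent posts against that of a control group is then what let the researchers measure “contagion.”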

The study was conducted by researchers from Cornell University and the University of California, San Francisco, alongside Facebook. It was actually published a few weeks ago but only came to wider attention this past weekend.

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study reads. It goes on to say that the results indicate the emotions expressed by others on Facebook influence our own emotions, constituting evidence of “massive-scale contagion via social networks.”

Sure, it’s not exactly uncommon for such studies to be conducted or for trends to be analyzed, but this time the researchers went one step further and manipulated what users saw in order to provoke a reaction, a far more serious matter when you’re talking about a social network that people trust with their information.

What does Facebook have to say about this?

Of course, there has been plenty of backlash in the past few hours, since no one explicitly gave Facebook permission to do anything of the kind or to take part in such a study. The researchers said the experiment was consistent with Facebook’s Data Use Policy, which all 1.2 billion users have agreed to, but that hardly makes it right, since almost no one actually reads the terms.

The company is defending itself, saying that the research was meant to improve the service and to make the content people see on Facebook as relevant and engaging as possible.

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” the company wrote.

“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process,” Facebook continued.

Does this make everything right? Probably not, but chances are that the company won’t suffer too much from this. Many have threatened to leave the social network following the scandal, but how many will follow through remains to be seen.

The fact of the matter is that when the NSA revelations first came to light and the entire tech industry was exposed as working with the intelligence agency (even if it did so kicking and screaming), people made the same promise: to quit Google, Yahoo, Facebook, Apple, Microsoft, and so on. Few actually did, and few are likely to quit Facebook after this latest scandal.