Running a study would have been OK, manipulating people's emotions is not

Jun 30, 2014 23:17 GMT

A couple of years ago, Facebook quietly ran an experiment on the platform to see whether emotional states could be transferred to others via emotional contagion. In plain terms, it analyzed your status updates, determined whether they carried a positive or negative message, and checked whether others were influenced by your mood.

Studies like this aren't exactly new, although they are now being applied to social media to keep up with the times. They're not even illegal, since we all pretty much signed away our right to argue with Facebook the second we agreed to open an account on the platform. After all, its Data Use Policy clearly states that Facebook can use our information for such purposes.

The issue with this study wasn't that it was conducted, because this kind of research happens all the time. No, the problem is that Facebook allowed the researchers to manipulate what users saw in order to observe their reactions. Nearly 700,000 people were involved in the study without any prior knowledge of it, and any of us could have been on that list.

So the question is – was any of this moral? I say “nay” and it looks like I’m not the only one sharing this opinion.

Why? Because Facebook already collects our data and runs it through the grinder. Because advertisers already know everything about us, even if they don't know who we are, since the social network handles the intermediary steps. Because not one of these users gave Facebook permission to make them the subject of a psychological experiment, and because no one was even asked whether they wanted to participate.

Of course, Facebook will say that telling anyone they were going to be part of such an experiment would have ruined the whole thing: no one would have reacted as they normally do, they'd have controlled themselves, reined it in, and the entire effort would have been in vain.

And they'd pretty much be right about this, because that's human nature. Tell someone they're being recorded, even if they're not, and they'll still mind their body language, hold back from expressing themselves as they normally would, and so on. It's the same issue with this experiment.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper,” said one of Facebook’s employees who helped run the study.

So, basically, all they cared about was that people might stop logging on to Facebook if their friends were having a bad day. They weren't even really interested in finding out how bad these people were feeling.

“The goal of all of our research at Facebook is to learn how to provide a better service,” writes Adam Kramer. How exactly Facebook intends to counterbalance the bad days your friends are having while you're browsing the network remains unexplained. Perhaps more cute cat videos will magically appear if your best buddy whines about having to wait an extra five minutes for the bus.

The bottom line is that it's wrong to play with people's feelings, and that holds both in real life and online. Manipulating people's feeds so that they see only positive or only negative emotional posts for a week is deeply wrong, because you have no idea how that could affect them.

For instance, someone who's battling depression may sink even deeper if they were unfortunate enough to end up in the second group, the one shown only negative posts.

This was Facebook playing Russian roulette with perhaps half of the nearly 700,000 users it included in the study, and that's highly disturbing.