Facebook Emotional Study Draws Federal Scrutiny

A lawmaker has sent the FTC a letter questioning Facebook's experiment

Facebook is already in trouble in Europe over its emotion manipulation study, and the experiment has cost it a big chunk of its popularity among the platform's members, but it seems the troubles aren't over yet.

In fact, the social network is now under fire in the United States as well, as Senator Mark Warner has sent a letter to the Federal Trade Commission (FTC) questioning the experiment's ethics, transparency and accountability.

“I come from the technology world, and I understand that social media companies are looking for ways to extract value from the information willingly provided by their huge customer base. I don’t know if Facebook’s manipulation of users’ news feeds was appropriate or not. But I think many consumers were surprised to learn they had given permission by agreeing to Facebook’s terms of service. And I think the industry could benefit from a conversation about what are the appropriate rules of the road going forward,” Warner said.

In the letter, he expressed concern about whether Facebook responsibly assessed the risks and benefits of conducting this behavioral experiment, as well as about the ethical guidelines, if any, that were used to protect individuals.

According to a study Facebook's researchers recently published in a scientific journal, the company randomly selected nearly 700,000 users and tested whether one individual's mood affected others.

To do this, Facebook's algorithms classified each message a user posted as positive or negative based on the words it contained.
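The article doesn't say which tool Facebook used, but word-based classification of this kind is typically a simple word-count scheme. The sketch below is a hypothetical illustration, assuming small hand-picked word lists and a `classify` helper that are not from the study itself:

```python
# Hypothetical sketch of word-list sentiment classification.
# The word lists and classify() are illustrative assumptions,
# not the actual tool used in Facebook's study.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "lonely"}

def classify(message: str) -> str:
    """Label a message positive, negative, or neutral by counting words."""
    words = message.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify("What a wonderful, happy day!"))  # positive
print(classify("I hate this awful weather"))     # negative
```

A real system would use much larger dictionaries, but the principle, counting emotion-laden words per post, is the same.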

The study wouldn't have been that outrageous had Facebook not deliberately modified the News Feed algorithm to show certain users more negative or more positive messages and then analyzed their reactions.

The experiment was conducted anonymously, as are all other tests performed by Facebook, which means no one knows whether someone in the "negative" group was perhaps suffering from depression, a possibility with troubling implications.

Facebook has defended itself by saying that users agreed to have their data used for this type of activity when they signed up for an account. Of course, few people read the Terms of Use agreement, since such agreements are broadly similar across services, so the news came as a surprise to many.

On the other hand, online services commonly use people's data for various studies and tests in order to improve and to give users what they want before they know it. The data can also be used, however, for valuable reports sold to marketers, which in the end translates into dollars for the company. This isn't particular to Facebook, but rather something we have learned to accept when we use an online service for free.
