PNAS says the practices involved in the study are a reason for concern

Jul 4, 2014 09:00 GMT  ·  By

Not even the scientific journal that published Facebook’s emotion manipulation study is certain that the company did the right thing.

Proceedings of the National Academy of Sciences, the journal in question, stopped short of retracting the study from its pages, but the publication did have some words to say about the experiment. More specifically, the journal was concerned that the social media giant did not follow scientific ethics and principles of informed consent.

The journal states that it normally publishes experiments in which subjects were allowed to opt out of the research. Facebook will no doubt argue that it did exactly that, by giving users the choice of having an account… or not. Technically speaking, Facebook does spell everything out in its data use documents, the very ones that everyone ticks a box to accept when signing up for an account, but that no one ever reads.

In other words, users can opt out of it all, but only at the cost of not having a Facebook account in the first place.

“Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper. It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out,” the journal said.

According to the publication, when scientists conduct such research, they must obtain informed consent and allow subjects to opt out, a requirement known as the Common Rule. As a private company, however, Facebook is under no obligation to follow these provisions when it collects data from users, and the “Common Rule does not preclude their use of data.”

In short, there is little that can be done against Facebook, since all users agreed to its terms when they signed up.

It also appears that reviewers at Cornell University had raised concerns before the research was published, noting that the experiment did not fall under the US government’s human research protection program.

Proceedings of the National Academy of Sciences made these statements after privacy activists filed a formal complaint with US regulators. The Electronic Privacy Information Center (EPIC) said the heavily criticized study deceived consumers and violated an agreement on privacy settings.

“At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers. Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users’ affirmative express consent prior to sharing user information with third parties,” reads the complaint.

EPIC states that Facebook’s conduct is both a deceptive trade practice under Section 5 of the FTC Act and a violation of the Commission’s 2012 Consent Order.

The experiment in question involved Facebook analyzing the words in users’ posts to determine whether they were positive or negative. Some 700,000 people were included in the study.
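The word-level classification described above can be pictured as a simple dictionary lookup: count how many words in a post appear on a “positive” list versus a “negative” list. The study reportedly relied on the LIWC word-counting software for this; the tiny word lists below are illustrative stand-ins, not the real LIWC dictionaries.

```python
# A minimal sketch of word-list sentiment classification, in the spirit of
# LIWC-style word counting. The word sets here are illustrative stand-ins.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "bad"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting word matches."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("What a wonderful, happy day!"))    # positive
print(classify_post("This is terrible and I hate it"))  # negative
```

At this scale the approach is crude, but applied to hundreds of thousands of posts it lets a platform label content as broadly positive or negative without any human reading it.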

The controversial part is that Facebook split the group in two and tweaked the News Feed algorithm: one group saw predominantly negative posts, while the other saw more positive ones. The social network wanted to find out whether emotional contagion applies to social media.

In effect, Facebook manipulated users’ emotions, and one can only hope that no one who was clinically depressed ended up in the group shown mostly negative messages.