Facebook’s Legal Loopholes

Facebook recently published an analysis of its own users’ tendencies. Based on its internal data, the firm announced that individuals who received sad news in their news feeds tended to respond with sad comments. Conversely, individuals who received happier news tended to respond with happier comments and behavior. The firm referred to this phenomenon as “emotional contagion.”

Surprising? Not really. But a furor arose when the scientific community noticed a rather disconcerting fact about the Facebook analysis. Namely, Facebook had secretly manipulated the news feeds of its users, and had then studied their reactions.

In other words, the firm did not simply observe patterns of emotions that already existed in its users. It explicitly manipulated those emotions without informing its users.

Was this wrong? Facebook supporters might argue that its users don’t really care why certain postings are displayed more prominently in their news feeds than others. And by opening Facebook accounts, users agree (by clicking a check box) to permit the firm to modify its news feed display algorithms at any time.

But is it legal to secretly manipulate news feeds in order to study whether the emotions and behavioral reactions of users can be manipulated? Apparently, due to a pair of loopholes in federal law, Facebook committed no crime.

You see, if senior professors at a research university had proposed such a project, they would have been legally obligated to seek the approval of their university’s Institutional Review Board (IRB) before proceeding with the study. Under federal law, any academic research project involving human subjects must undergo such a review in order to protect the rights of participants.

What concerns are scrutinized by IRBs? Failing, for instance, to obtain the “informed consent” of participants prior to the start of research activities. Failing to notify participants that they can withdraw from the study at any time without penalty. Engaging in any type of deception.

Although these are not necessarily fatal flaws in research studies, they most certainly represent “red flags” that draw the attention of any university IRB. They also represent prominent features of Facebook’s research study. But because Facebook is a private company rather than a university, its researchers are not required to adhere to federal IRB regulations, and the study proceeded without such oversight.

Interestingly, Facebook did reach out to academic researchers at Cornell University to help it analyze the results and write the published study. But because those academicians joined the project after the “field work” had been concluded, the Cornell University IRB did not assert jurisdiction over the research activity.

In other words, Facebook benefited from a pair of loopholes in federal IRB law. Because Facebook is not a university, and because the firm did not engage academic researchers until after it had concluded all data collection activities, it avoided IRB oversight entirely.

Nevertheless, from a risk management perspective, the social media giant may be well advised to voluntarily adopt IRB-style policies and procedures in the future. By failing to do so, the firm arguably places both the well-being of its own users and the ethical strength of its corporate reputation in jeopardy.

Disclosure: Mike Kraten, the author of this posting and co-publisher of this AQPQ blog, is the Chair of the Institutional Review Board (IRB) of Providence College in Providence, Rhode Island. All opinions expressed in this posting represent his personal opinions; none represent the policies or positions of Providence College.