Facebook Has All the Power

Amid growing calls for formal investigations into Facebook's disturbing mood manipulation research, media scholar Jay Rosen has a reminder for journalists, editors, and personal social media users alike: "Facebook has all the power. You have almost none."

The experiment, conducted without users' knowledge or consent, manipulated the News Feeds of nearly 700,000 Facebook users with the purpose of testing mood responses to content alteration.

The Federal Trade Commission is considering two formal complaints about the 2012 Facebook research, contentiously published in the Proceedings of the National Academy of Sciences—one from US Senator Mark Warner and another from the Electronic Privacy Information Center. Meanwhile, in the UK, the Information Commissioner's Office is investigating.

I caught up with Rosen, who teaches journalism at NYU and advises First Look Media.

* * *

As a journalism and media studies professor, what do you think universities need to do to ensure higher ethical standards are applied to human-subjects research on social networks?

The problem is that scholars covet thy neighbor's data. They're attracted to the very large and often fascinating data sets that private companies have developed. And for good reason. There could be a lot of discoveries hidden in there! It's the companies that own and manage this data. The only standards we know they have to follow are in the terms-of-service that users accept to create an account, and the law as it stands in different countries.

I believe it was the "sexiness" of the Facebook data that led Cornell University and the Proceedings of the National Academy of Sciences (PNAS) into an ethically dubious arrangement, where, for example, Facebook's unreadable 9,000-word terms-of-service are said to be good enough to meet the standard for "informed consent."

When the study drew attention and controversy, there was a moment when they both could have said: "We didn't look carefully enough at this the first time. Now we can see that it doesn't meet our standards." Instead they allowed Facebook and the PR people to take the lead in responding to the controversy. The moment for reflection was lost. Cornell (home to the two scholars who collaborated with Facebook) is saying it did everything right. PNAS is saying it has "concerns," but it too did everything right.

We know that Facebook has access to our content, and our rights as users are steadily being diminished. Still, there is something particularly creepy about learning that, without our consent and in an Orwellian fashion, we can be unwitting participants in psychological experimentation. What should this reality signal to Facebook users? Is it time to pull back?

You have (almost) no rights. You have (almost) no control. You have no idea what they're doing to you or with you. You don't even know who's getting the stuff you are posting, and you're not allowed to know. Trade secret! As the saying goes: "If you're not paying for the product, you are the product." As long as you understand and accept all that, then proceed. With caution.