
Is Facebook really turning us into lab rats?

How shocking: Facebook had the temerity to conduct an experiment on its users without telling them and now the results have been published in the Proceedings of the U.S. National Academy of Sciences. Actually, no one should be surprised.

For a week in 2012, the social network's staff scientist Adam Kramer and two collaborators used algorithms to doctor the news feeds of 689,003 English-speaking Facebook users. They reduced the number of posts containing "positive" words in some users' feeds and "negative" words in others, tracked their lab-rat users' own posts, and found that the users' mood was influenced by that of the news feed. The term, well known to psychologists studying real-world communication, is "emotional contagion."

If Kramer's weeklong experiment appears outrageous, what Facebook does on a daily basis is monstrous. An algorithm called EdgeRank scores each post on a number of criteria, such as how frequently the News Feed's owner interacts with its author and the quality of that interaction (a comment is worth more than a "like"). The higher-ranked posts go to the top of the feed. That's why a typical user doesn't see everything her friends are posting - just what Facebook decides she'd be interested in seeing, plus paid advertising (which is also supposed to be targeted). You can tweak the settings to make posts appear in their "natural" order, but few people bother to do it, just as hardly anyone ever reads Facebook's data use policy: buried among its roughly 9,000 words is a sentence saying that research is a legitimate use.
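To make the idea concrete, here is a minimal sketch of what ranking a feed along those lines could look like. The weights, field names and time-decay factor are my own illustrative assumptions, not Facebook's actual EdgeRank formula; the only detail taken from the description above is that a comment counts for more than a "like" and that higher-scoring posts rise to the top.

```python
# Toy feed-ranking sketch in the spirit of EdgeRank.
# All weights and field names are illustrative assumptions, not Facebook's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float  # how often the feed's owner interacts with this author (0-1)
    interactions: dict      # e.g. {"comment": 2, "like": 5}
    age_hours: float        # how old the post is

# Per the article, a comment is worth more than a "like".
INTERACTION_WEIGHTS = {"comment": 3.0, "like": 1.0}

def score(post: Post) -> float:
    interaction_value = sum(
        INTERACTION_WEIGHTS.get(kind, 0.5) * count
        for kind, count in post.interactions.items()
    )
    time_decay = 1.0 / (1.0 + post.age_hours)  # newer posts rank higher
    return post.author_affinity * interaction_value * time_decay

def rank_feed(posts: list[Post]) -> list[Post]:
    # Higher-ranked posts go to the top of the feed.
    return sorted(posts, key=score, reverse=True)
```

The point of the sketch is simply that the feed you see is the output of a scoring function, not a chronological record of what your friends posted.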

In other words, on Facebook you can opt out of having a machine decide what content you will find engaging. Twitter, by contrast, lets users opt in, through its so-called Discover feed. I find the opt-in tactic more honest, but, predictably, it's less effective from a marketing point of view.

Facebook manipulates what its users see as a matter of policy. Academics may discuss whether the users give their informed consent to such use of their data, but common experience suggests that a lot of people choose to stay uninformed. Others realize they are being tracked, and experimented upon, by the likes of Facebook and Google and don't mind. It's par for the course on the social web. "Run a website, measure anything, make any changes based on measurements? Congratulations, you're running a psychology experiment!" venture capitalist Marc Andreessen, who sits on Facebook's board, tweeted.

People who hate this have the option of not using Facebook and switching to a network that is less invasive in its attempts to leverage its user base. It's like unplugging that TV or quitting smoking: so easy, and yet so hard.

The good news for those who want their social networking fix regardless is that Facebook has only a limited ability to influence our emotions. "At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it - the result was that people produced an average of one fewer emotional word, per thousand words, over the following week," Kramer explained in a Facebook post on Sunday.

Kramer, as the person who designed the experiment, may if anything be overstating its power. The algorithm used to check for positive and negative words, Linguistic Inquiry and Word Count (LIWC), is one of many primitive attempts to map natural language onto mathematical criteria. These programs see the word "happy" as positive and the word "sad" as negative, but they cannot detect sarcasm, which is essential in analyzing social network content, and they have a limited grasp of grammatical constructs that sometimes turn word meanings on their head. Psychologist John Grohol wrote that LIWC would assess the sentence "I am not having a great day" as neutral, because it counts "great" as a positive word and "not" as a negative one, and the two cancel out.
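Grohol's failure mode is easy to reproduce with a naive dictionary-based counter of the kind he is describing. The tiny word lists below are my own stand-ins for illustration, not LIWC's real dictionaries.

```python
# Naive dictionary-based sentiment counting, in the spirit of LIWC-style tools.
# The word lists are illustrative assumptions, not LIWC's actual dictionaries.
POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "not", "hate"}  # simple negation words land on the negative list

def naive_sentiment(text: str) -> int:
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# "great" (+1) and "not" (-1) cancel out, so a clearly negative sentence
# comes out neutral - exactly the problem Grohol points to.
print(naive_sentiment("I am not having a great day"))  # 0
```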

As with a lot of Facebook data, which the enormous network has no ability to police, it's garbage in, garbage out.

If the U.S. Army Research Office actually took part in funding the research - as Cornell University, whose Jamie Guillory was one of Kramer's co-authors, initially reported (and then corrected itself to say there had been no external funding) - the military did not get access to a powerful mood-changing tool. People may be lazy about reading policy documents and resisting corporate manipulation, but they are more complicated than their Facebook accounts.
