Facebook runs "emotional contagion" experiment on 600,000 users, without their informed consent, by manipulating their news feeds
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
Here's the method:
On Facebook, people frequently express emotions, which are later seen by their friends via Facebook’s “News Feed” product (8). Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.
The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed.
(Look, I'm sure this is just an experiment, right? Leaving aside the question of whether N=689,003 implies less "experiment" and more "intervention." For example, in Fort Worth, TX N = 792,727; in Detroit, MI N = 688,701; and for El Paso, TX N = 674,433. "Experiment" or no, these are large numbers -- Silicon Valley squillionaires talk about "scale" all the time, right? So you'd expect the numbers to be large -- and they're large enough to have real world effects.)
This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.
Are the designers of this study really so unimaginative -- or so sociopathic -- that they cannot imagine that "posting behaviors" might reinforce or even create real world behaviors?
The results show emotional contagion. As Fig. 1 illustrates, for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks (3, 7, 8), and providing support for previously contested claims that emotions spread via contagion through a network.
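The outcome the paper reports -- the percentage of positive and negative words in subjects' subsequent status updates -- can be sketched as a simple word count. This is a minimal stand-in for what LIWC actually does; the tiny word lists here are illustrative, not LIWC's dictionaries:

```python
# Minimal sketch of the paper's outcome measure: what percentage of the
# words in a user's status updates are positive or negative. The tiny
# lexicons below are illustrative stand-ins, NOT LIWC2007's dictionaries.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "angry"}

def emotion_word_percentages(updates):
    """Return (positive %, negative %) across a list of status updates."""
    words = [w for text in updates for w in text.lower().split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100.0 * pos / len(words), 100.0 * neg / len(words)

pos_pct, neg_pct = emotion_word_percentages(
    ["so happy today", "traffic was awful"]
)
# 6 words, 1 positive, 1 negative: roughly 16.7% each
```

The paper's Fig. 1 effect is a shift in exactly this kind of percentage, depending on which emotional content was filtered out of the subject's feed.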
Great. We as a society -- heck, as a global society -- have just put Mark Zuckerberg in charge of our emotional well-being.
Textual content alone appears to be a sufficient channel.
Which, when you think about it, any novelist would know. So, why did Facebook think this was OK? "Terms and Conditions," of course!
Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) (9) word counting system, which correlates with self-reported and physiological measures of well-being, and has been used in prior research on emotional expression (7, 8, 10). LIWC was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research. Both experiments had a control condition, in which a similar proportion of posts in their News Feed were omitted entirely at random (i.e., without respect to emotional content). Separate control conditions were necessary as 22.4% of posts contained negative words, whereas 46.8% of posts contained positive words. So for a person for whom 10% of posts containing positive content were omitted, an appropriate control would withhold 10% of 46.8% (i.e., 4.68%) of posts at random, compared with omitting only 2.24% of the News Feed in the negativity-reduced control.
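The classification rule and the control-condition arithmetic quoted above are simple enough to sketch. Assuming stand-in word lists (the real LIWC2007 dictionaries are proprietary), a post counts as "positive" or "negative" if it contains at least one matching word, and the matched control omits posts at random at the treatment rate scaled by the base rate of that emotion:

```python
# Sketch of the per-post classification rule and the matched-control
# arithmetic described in the paper. The word lists are illustrative
# stand-ins for LIWC2007's proprietary dictionaries.

POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "hate", "awful", "angry"}

def classify_post(text):
    """A post is positive/negative if it contains at least one word
    from the corresponding list (the paper's stated criterion)."""
    words = set(text.lower().split())
    return {"positive": bool(words & POSITIVE_WORDS),
            "negative": bool(words & NEGATIVE_WORDS)}

def control_omission_rate(treatment_rate, base_rate):
    """Matched control: withhold posts entirely at random, at the
    treatment rate scaled by the base rate of that emotion."""
    return treatment_rate * base_rate

# The paper's figures: 46.8% of posts contained positive words, 22.4%
# negative. For a subject with 10% of positive posts withheld, the
# matched control withholds 10% of 46.8%, i.e. 4.68% of all posts;
# the negativity-reduced control withholds only 2.24%.
pos_control = control_omission_rate(0.10, 0.468)  # ~0.0468
neg_control = control_omission_rate(0.10, 0.224)  # ~0.0224
```

Note that a single post can be both "positive" and "negative" under this rule, which is why the two base rates (46.8% and 22.4%) need not sum to 100%.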
So, it's OK to manipulate my emotions as long as you do the measurement at arm's length, through an algorithm? Are these people nuts? And have you actually read the Facebook Terms and Conditions (which Facebook can change, at any time, retroactively)? Here are Facebook's terms (sorry if the tiny little screen dumps don't align):

[Screen dumps of Facebook's Terms and Conditions not reproduced here.]
Any time you see overweening complexity like that, it's a sign some neoliberal rentier is trying to pick your pocket: Mortgage forms during the housing bubble; IRA or 401k forms, any time; the ObamaCare website. And so on. Kinda makes a nonsense of the idea that future changes will be "opt in," doesn't it? First, who can opt into or out of monstrosities like those? And second, did anybody really think they were opting into being lab rats? I don't think so.
Funny thing, though. I missed the part where Facebook decided to give its users -- experimental test subjects, really -- an honorarium: say, $20 a pop. $20 * 689,003 comes to $13,780,060 total, or about what Zuckerberg spent on one of the four homes he just bought to ensure his own privacy. Nobody's going to make Zuckerberg a lab rat! I guess his houses are all on one-way streets...
NOTE What would be really funny, and so meta, would be if FB is running the same kind of experiment on its Terms and Conditions.
UPDATE Here is a really interesting study, also from the PNAS, on emotions, the body, and cultural universals. I'm linking to it here so I don't forget it.
When universities conduct studies on people, they have to run them by an ethics board first to get approval — ethics boards that were created because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience and letting men live with syphilis for study purposes. A 2012 profile of the Facebook data team noted, “Unlike academic social scientists, Facebook’s employees have a short path from an idea to an experiment on hundreds of millions of people.” I’ve reached out to Facebook to find out what the review process was here. Via @ZLeeily, the PNAS editor on the article says this study did pass muster with an Institutional Review Board, but we’ll see if it passes muster with users.
Ideally, Facebook would have a consent process for willing study participants: a box to check somewhere saying you’re okay with being subjected to the occasional random psychological experiment that Facebook’s data team cooks up in the name of science. As opposed to the commonplace psychological manipulation cooked up by advertisers trying to sell you stuff.
While users may have some awareness of the terms they’ve agreed to when signing up, they aren’t fully aware of what their information is being used for. A Consumer Reports survey, for example, found that “only 37 percent of Facebook users say they have used the site’s privacy tools to customize how much information apps are allowed to see.” Those third parties are one of the biggest collectors of data. And, the report explains, this touches everyone, not just those who don’t change their app settings. “Even if you have restricted your information to be seen by friends only, a friend who is using a Facebook app could allow your data to be transferred to a third party without your knowledge.”
“There’s a burden on the individual to get educated, but there’s also a burden on the companies,” Dr. Pamela Rutledge, director of the Media Psychology Research Center, told ThinkProgress earlier this year. “We’re not all lawyers, we’re not all IT guys,” she said.
Caveat emptor, except of course when the product is free, you are the product, so perhaps a different sort of caveat is required.