
Facebook runs "emotional contagion" experiment on 600,000 users, without their informed consent, by manipulating their news feeds

From the Proceedings of the National Academy of Sciences:

Significance
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

Jeebus!

Here's the method:

On Facebook, people frequently express emotions, which are later seen by their friends via Facebook’s “News Feed” product (8). Because people’s friends frequently produce much more content than one person can view, the News Feed filters posts, stories, and activities undertaken by friends. News Feed is the primary manner by which people see content that friends share. Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.

The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed.

(Look, I'm sure this is just an experiment, right? Leaving aside the question of whether N=689,003 implies less "experiment" and more "intervention." For example, in Fort Worth, TX N = 792,727; in Detroit, MI N = 688,701; and for El Paso, TX N = 674,433. "Experiment" or no, these are large numbers -- Silicon Valley squillionaires talk about "scale" all the time, right? So you'd expect the numbers to be large -- and they're large enough to have real world effects.)

This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure—thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.

Are the designers of this study really so unimaginative -- or so sociopathic -- that they cannot imagine that "posting behaviors" might reinforce or even create real-world behaviors?

The results show emotional contagion. As Fig. 1 illustrates, for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks (3, 7, 8), and providing support for previously contested claims that emotions spread via contagion through a network.

Great. We as a society -- heck, as a global society -- have just put Mark Zuckerberg in charge of our emotional well-being.

Textual content alone appears to be a sufficient channel.

Which, when you think about it, any novelist would know. So, why did Facebook think this was OK? "Terms and Conditions," of course!

Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) (9) word counting system, which correlates with self-reported and physiological measures of well-being, and has been used in prior research on emotional expression (7, 8, 10). LIWC was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research. Both experiments had a control condition, in which a similar proportion of posts in their News Feed were omitted entirely at random (i.e., without respect to emotional content). Separate control conditions were necessary as 22.4% of posts contained negative words, whereas 46.8% of posts contained positive words. So for a person for whom 10% of posts containing positive content were omitted, an appropriate control would withhold 10% of 46.8% (i.e., 4.68%) of posts at random, compared with omitting only 2.24% of the News Feed in the negativity-reduced control.
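
To make the arithmetic in that passage concrete, here is a minimal sketch, not Facebook's actual code: the word lists below are stand-ins for the proprietary LIWC2007 dictionaries, but the "at least one word" criterion and the matched-control percentages follow the paper's own description.

```python
# Hypothetical stand-in word lists; the study used the LIWC2007 dictionaries.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}

def classify(post):
    """A post counts as positive and/or negative if it contains at least
    one word from the corresponding list (the paper's criterion)."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    labels = set()
    if words & POSITIVE_WORDS:
        labels.add("positive")
    if words & NEGATIVE_WORDS:
        labels.add("negative")
    return labels

# Base rates reported in the paper: 46.8% of posts contained positive words,
# 22.4% contained negative words.
POS_RATE, NEG_RATE = 0.468, 0.224

def control_omission_rate(treatment_omission, base_rate):
    """If the treatment condition drops a given share of emotional posts,
    the matched control drops that share of the base rate of ALL posts at
    random, so both feeds shrink by the same amount."""
    return treatment_omission * base_rate

print(classify("What a wonderful, happy day"))         # {'positive'}
print(f"{control_omission_rate(0.10, POS_RATE):.2%}")  # 4.68% of the feed
print(f"{control_omission_rate(0.10, NEG_RATE):.2%}")  # 2.24% of the feed
```

Note what that means: the "control" group isn't unmanipulated; its feed is thinned at random, calibrated to remove the same volume of posts.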

So, it's OK to manipulate my emotions as long as you do the measurement at arm's length through an algorithm? Are these people nuts? And have you actually read the Facebook Terms and Conditions (which Facebook can change, at any time, retroactively)? Here are Facebook's terms (sorry if the tiny little screen dumps don't align):

[Screenshots of Facebook's Terms and Conditions]

But wait! There's more! Here's the privacy policy:

[Screenshots of Facebook's privacy policy]

Any time you see overweening complexity like that, it's a sign some neoliberal rentier is trying to pick your pocket: Mortgage forms during the housing bubble; IRA or 401k forms, any time; the ObamaCare website. And so on. Kinda makes a nonsense of the idea that future changes will be "opt in," doesn't it? First, who can opt into or out of monstrosities like those? And second, did anybody really think they were opting into being lab rats? I don't think so.

Funny thing, though. I missed the part where Facebook decided to give its users (sorry, experimental test subjects) an honorarium: say, $20 a pop. $20 * 689,003 comes to $13,780,060 total, or about what Zuckerberg spent on one of the four homes he just bought to ensure his own privacy. Nobody's going to make Zuckerberg a lab rat! I guess his houses are all on one-way streets...

NOTE What would be really funny, and so meta, would be if FB is running the same kind of experiment on its Terms and Conditions.

UPDATE Here is a really interesting study, also from the PNAS, on emotions, the body, and cultural universals. I'm linking to it here so I don't forget it.

UPDATE Forbes zeroes in on the informed consent issue:

When universities conduct studies on people, they have to run them by an ethics board first to get approval — ethics boards that were created because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience and letting men live with syphilis for study purposes. A 2012 profile of the Facebook data team noted, “Unlike academic social scientists, Facebook’s employees have a short path from an idea to an experiment on hundreds of millions of people.” I’ve reached out to Facebook to find out what the review process was here. Via @ZLeeily, the PNAS editor on the article says this study did pass muster with an Institutional Review Board, but we’ll see if it passes muster with users.

Ideally, Facebook would have a consent process for willing study participants: a box to check somewhere saying you’re okay with being subjected to the occasional random psychological experiment that Facebook’s data team cooks up in the name of science. As opposed to the commonplace psychological manipulation cooked up by advertisers trying to sell you stuff.

UPDATE Think Progress focuses on the Terms:

While users may have some awareness of the terms they’ve agreed to when signing up, they aren’t fully aware of what their information is being used for. A Consumer Reports survey, for example, found that “only 37 percent of Facebook users say they have used the site’s privacy tools to customize how much information apps are allowed to see.” Those third parties are one of the biggest collectors of data. And, the report explains, this touches everyone, not just those who don’t change their app settings. “Even if you have restricted your information to be seen by friends only, a friend who is using a Facebook app could allow your data to be transferred to a third party without your knowledge.”

Advocates have said one way to combat this absolute lack of knowledge — and to avoid being unwittingly used in a psychological experiment — is to mandate that tech companies put their terms of use in “plain English.”

“There’s a burden on the individual to get educated, but there’s also a burden on the companies,” Dr. Pamela Rutledge, director of the Media Psychology Research Center, told ThinkProgress earlier this year. “We’re not all lawyers, we’re not all IT guys,” she said.

Caveat emptor; except that when the product is free, you are the product, so perhaps a different sort of caveat is required.


Comments

Submitted by mellon on

It's time to start creating open alternatives to closed systems like Facebook. They're all probably being manipulated like this. Also, with an open system people will know what's there in terms of privacy. Openness... techniques to stop spamming and astroturfing.

(some private web sites have become completely controlled by astroturfers)

It's really important to do this - soon - for the preservation of our country's democracy.

Submitted by mellon on

I wonder, has anybody ever tried a web community where everybody was an equal owner?

There is a lot to be said for the usefulness of creating different kinds of structures as sort of thought experiments to see what works the best. Say you start out with 100 people who decide, we're going to try to build a number of different structures to see which one seems the most natural. And then do it.

Submitted by quixote on

Well, to be fair about the psych experiment aspect of all this, the researchers didn't know beforehand that the emotive words would have any effect.

I know. I know. *Anybody* knows that words have power. But it's pretty much an article of faith in psych departments that we're all independent rational thinkers who make up our own minds about everything. No, really. That's what they assume. That's why nobody is horrified about ads, especially political ads. It also makes you wonder how psych departments explain all the money spent on ads, but that's another whole can of worms.

So, anyway, you'll notice the kind of surprised tone in their conclusions. "ZOMG. People actually went and changed their mood based on atmospherics."

Now, will everybody follow this to the logical conclusion and make all this subliminal suggestion illegal? Starting with advertising? I shall watch our future progress with considerable interest.

Submitted by lambert on

... is what? Less than 200 years old? I'd get rid of it without a qualm. Make "tampering with neural tissue" for profit a crime under the major heading of "Organic Damage."

Submitted by quixote on

"make it a crime."

Absolutely. People have zero clue how much that stuff affects them (and the rare research that looks shows how much it does, e.g. 1, 2, 3, 4. And, worst of all for democracies, a linear relationship between amount spent on campaign ads and election outcomes: Silveira and De Mello, 2011.)

Do I feel strongly about this? Yes. Yes, I do!

Submitted by mellon on

Will the government leverage the ignorance of most people, and use the "emergency" powers granted them after 9-11, to pro-actively condemn and isolate online the people who know inconvenient, disturbing things?

They could get buy-in from the large sites by promising them lots and lots of money to implement these sandboxes: "virtual concentration camps" for people who had been put onto a national shadowban or hellban list.

They would justify it by using words like "contagion".

Submitted by Dromaius on

I'm on FB as my dog (see profile pic here which is the same one I've used there ;-) ), so FB apparently knows my dog's emotions ;-). Of course, I'm only friends with REAL LIFE friends and family, people who actually know me so we spoke about the "friendship" ahead of time.

FB would work fine for us if we all changed to using phony profiles. Of course, it means we wouldn't be able to collect 4999 nearest and dearest friends because most people wouldn't know us, but that's actually probably better in a lot of ways. If we all did this, we would reap the benefits without much of the detrimental loss of privacy.