Facebook Manipulates Our Moods For Science And Commerce: A Roundup

Facebook researchers manipulated newsfeeds of nearly 700,000 users to study "emotional contagion." (iStockPhoto)

So, that happened.

Scientists published a paper revealing that in 2012, Facebook researchers conducted a study of "emotional contagion." The social media company altered the news feeds (the main page users land on for a stream of updates from friends) of nearly 700,000 users. Feeds were changed to show more "positive" or "negative" content, to determine whether seeing more sad messages makes a person sadder.

The bottom line: News feeds were tweaked without warning, because Facebook users had agreed to the social giant's general terms of data use, and researchers tracked test subjects' emotional responses by judging any subsequent changes in their use of language. It's unclear whether you or I were among those tested. By accepting that check-box agreement, users gave permission for this kind of psychological experimentation.

If that isn't bleak enough, we've reported previously that in a separate study, University of Michigan researchers found the very existence of feeds was making some users sadder.

Much of the Internet is outraged. "Even the Editor of Facebook's Mood Study Thought It Was Creepy," Adrienne LaFrance wrote at The Atlantic. If you're just catching up, here are a few reads to consider:

New Statesman: Facebook can manipulate your mood. It can affect whether you vote. When do we start to worry?

Laurie Penny explains that the study's findings are not the point — that Facebook did this is the point — and argues the potential for more is why the research feels so wrong.

"I am not convinced that the Facebook team knows what it's doing. It does, however, know what it can do — what a platform with access to the personal information and intimate interactions of 1.25 billion users can do. ...

"What the company does now will influence how the corporate powers of the future understand and monetise human emotion. Dr Adam Kramer, the man behind the study and a longtime member of the company's research team, commented in an excited Q & A that 'Facebook data constitutes the largest field study in the history of the world.' The ethics of this situation have yet to be unpacked."

Forbes: Facebook Doesn't Understand The Fuss About Its Emotion Study

Reporter Kashmir Hill has been aggressively covering this story for Forbes and got a response from Facebook stipulating that the research ran for a single week and that none of the data used were associated with any specific user. The Facebook response continues:

"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

Meanwhile, over on Hacker News, there's a lively debate about whether the outrage is overblown. You can check out the debate; its premise is a thought from venture capitalist and early Internet pioneer Marc Andreessen.

And finally, our pop culture writer Linda Holmes weighed in this morning with her piece, Lab Rats One And All: That Unsettling Facebook Experiment. She closes with a practical suggestion for Facebook:

"If this kind of experimentation is really OK, if it's really something they believe is within their everyday operations and their existing consent, all they have to do is clarify it. Give people a chance to say yes or no to research that is psychological or sociological in nature that involves not the anonymized use of their data after the fact but the placing of users in control and experimental groups. Just get 'em to say yes or no. If it's really not a big deal, they'll say yes, right? It really seems like a pretty reasonable request."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Elise Hu is a host-at-large based at NPR West in Culver City, Calif. Previously, she explored the future with her video series, Future You with Elise Hu, and served as the founding bureau chief and International Correspondent for NPR's Seoul office. She was based in Seoul for nearly four years, responsible for the network's coverage of both Koreas and Japan, and filed from a dozen countries across Asia.