DATA DETECTIVES

Facebook's Psychological Experiment Manipulated More Than 600,000 Users

The company ran a massive psychological experiment, confirming that users are affected by what they see on their newsfeed.

Facebook's data scientists conducted a massive experiment in which they manipulated people's feeds and found that longer-lasting moods, like happiness or depression, can be transferred across the social network.

The company tweaked the newsfeed algorithms of 689,003 unwitting Facebook users so that those people saw an abnormally low number of either positive or negative posts.

In a recently published study, the scientists say they found that when people saw fewer positive posts on their feeds, they produced fewer positive posts and instead wrote more negative posts. On the flip side, when scientists reduced the number of negative posts on a person's newsfeed, those individuals became more positive themselves.

"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," study authors Adam Kramer, Jamie Guillory, and Jeffrey Hancock write. "We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues."

This idea is interesting in and of itself, but the AV Club's William Hughes also points out that the study highlights something most users probably don't think about: By agreeing to Facebook's Data Use Policy when you sign up, you're automatically giving the company permission to include you in big psychological experiments like this, without your knowledge.

Facebook says it does research like this experiment to figure out how to make the content people see on Facebook as relevant as possible. A spokesperson sent us the following comment:

"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."

(Hat-tip to Rami Ismail who tweeted the study.)

-This story originally appeared on Business Insider

Last updated: Jun 30, 2014



