
Facebook Secretly Involved Thousands of Users In Massive Social Experiment

The House of Representatives passed a late-night vote on Thursday to cut funding to two of the National Security Agency’s most controversial practices: warrantless collection of Americans’ online data and the installation of surveillance “backdoors” on commercial tech products — a measure being applauded by tech companies and privacy advocates. (Photo by Chris Jackson/Getty Images)


LANHAM, Md. (WNEW) — The news feeds of nearly 700,000 Facebook users were manipulated as part of a psychological experiment about “emotional contagion” in January 2012, and no one knew it until the results were published in a recent issue of the Proceedings of the National Academy of Sciences.

“Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others,” the abstract of the study says. But researchers wanted to find out if that transfer could also happen online.

The study was designed to test whether a decrease in positive expressions on users’ news feeds caused them to produce fewer positive posts and more negative posts themselves, and vice versa. So researchers manipulated the extent to which people were exposed to emotional expressions on their feeds.

Turns out, the manipulation did have an effect, the researchers say. But now that Facebook users have learned about the experiment, their primary emotion is anger.

Adam Kramer, a Facebook data scientist and a co-author of the study, took to his own Facebook page Sunday to publish an explanation of his methods.

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” he wrote. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he went on to say.

Many took to another social media site — Twitter — to lament the manipulation of feeds, calling it “unethical” and saying that Facebook treated its users like “lab rats.”

Others said they weren’t surprised to learn of the experiment, or argued that it wouldn’t have shocked so many people if users had actually read Facebook’s Terms of Service when signing up.

What do you think of the experiment?
