When you log into Facebook, how much do you trust what you see? How would you feel if you found out that the newsfeeds had been intentionally altered and that you were part of a huge experiment?
For one week in January 2012, Facebook intentionally altered the newsfeeds of nearly 700,000 users as part of an emotional contagion experiment. Some of the users saw newsfeeds containing more positive, happy words, while others saw feeds rife with negative, sad words.
At the end of the week, Facebook analyzed how its users responded to the adjusted newsfeeds. They looked to see whether users who viewed the more positive feeds tended to post items with more positive, happier words, and vice versa for those who spent a week viewing the downer posts.
This type of follow-up reaction is called emotional contagion. The general concept is that when you are exposed to more positive and pleasant things, you tend to be more positive and pleasant yourself. When constantly exposed to more negative and unpleasant things, you tend to become one of those people that many of us try to ignore.
In fact, that’s exactly what the experiment discovered. Facebook users who were fed the positive newsfeeds for a week tended to post more positive content on their Facebook pages. Facebook users who were fed the negative newsfeeds posted more negative content.
They concluded that emotional states can be transferred from person to person via emotional contagion. Furthermore, most people are completely unaware that their emotional state is being altered by others, even without direct interaction.
Is what Facebook did legal? Yep! If you ever took the time to read all of the fine print you agreed to when you set up your Facebook account, you would have seen that under the terms of service, you give Facebook the right to use your data, postings, photos, etc. for data testing and research.
Is what Facebook did ethical? That argument is still ongoing. Robinson Meyer of The Atlantic is one of those tracking the ethical debate over the Facebook experiment. He wrote:
“We’re tracking the ethical, legal, and philosophical response to this Facebook experiment here. We’ve also asked the authors of the study for comment. Author Jamie Guillory replied and referred us to a Facebook spokesman. Early Sunday morning, a Facebook spokesman sent this comment in an email:”
“‘This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.’”
“And on Sunday afternoon, Adam D.I. Kramer, one of the study’s authors and a Facebook employee, commented on the experiment in a public Facebook post. ‘And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it,’ he writes. ‘Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. […] In hindsight, the research benefits of the paper may not have justified all of this anxiety.’”
“Kramer adds that Facebook’s internal review practices have ‘come a long way’ since 2012, when the experiment was run.”
I wonder what “a long way” really means. Does it mean that they can run their experiments on Facebook users in a more secretive manner without anyone finding out? Does it mean that Facebook can use this information to sway opinions on such things as politics, gun control, abortion, immigration, and other important national issues? We’ve already seen plenty of evidence of their liberal views.
It reeks of George Orwell’s Big Brother, where everyone was constantly bombarded with the government’s ‘Newspeak’ messages. Orwell depicts a society where individualism and independent thinking are ‘thoughtcrimes’ severely punished by the ruling Inner Party.
Facebook’s experiment only reinforces the use of Orwellian concepts. I also believe that the liberal mainstream media have been using emotional contagion, for why else would so many Americans be duped into voting for Obama, Reid, Pelosi, and other flaming liberal Democrats?