Facebook's News Feed experiment: This is not 'informed consent', say users

Facebook's 'emotional' experiment, in which the company altered the content on the News Feeds of close to 700,000 users to test how positive or negative content affected their emotions, has not gone down well with users.

The study, conducted by researchers affiliated with Facebook, Cornell University, and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences. Facebook data scientist Adam Kramer has offered an explanation, saying that the study was conducted because the company cared about the emotional impact of Facebook on its users. He says it in fact confirmed that when users see more positive content from others (such as stories about friends' successes or holidays), they too are likely to feel more positive, contrary to the conventional wisdom that they would feel jealous.

He also says that he understands the concerns people have had with the study, admitting that, "In hindsight, the research benefits of the paper may not have justified all of this anxiety." And whatever those benefits may have been, users in general aren't thrilled at the thought of being used as lab rats by Facebook.

On Twitter, users expressed outrage at the study, with some asking whether they were lab rats for the company. Others pointed out that this was not informed consent and questioned the ethics of the study's methodology.

On Facebook, people also pointed out that while the company's motives may have been fine, it should have considered the impact of such a study on mentally depressed people. Commenting on Kramer's statement, one user wrote, "I appreciate your making this statement, and I just hope that Facebook as a corporation will provide financial support to those who struggle with mental health issues...But this terrified me simply because there are millions of people in this country dealing with depression every day."

Another user pointed out, "I think the entire Internet/media/online presence is an experiment in human behavior. we have no idea what it's doing to people- we are just starting to get it. it's not very pretty. however, while I don't hold corporations responsible for my part in it, I do feel manipulating my exposure and decreasing contact with people I *thought I'd purposefully included in my realm of contacts seems shady. I disagree with the power to manipulate emotions/ my mood or responses to the world are my choice, but FB deciding what world I see is bogus. IMHO. I don't watch murder movies for a reason & I block constant negativity as often as possible. I don't need someone else making that choice for me."

Yet others pointed out that if you use Facebook, you have to be prepared for this sort of thing, since legal and ethical aren't always the same thing. One user wrote, "You don't want to be an experiment like this? Don't use Facebook. You agreed when you signed up."

 
