Facebook explains why it tinkered with the News Feeds of close to 700,000 users

Facebook data scientist Adam Kramer has written a post on the social networking site defending the study conducted on users' News Feeds.


Facebook is facing a lot of heat from users after it transpired that the company had secretly manipulated the News Feeds of some 700,000 users to study "emotional contagion."

The study was conducted in 2012 in collaboration with Cornell University and the University of California, San Francisco, and appeared in the 17 June edition of the Proceedings of the National Academy of Sciences. The researchers wanted to see whether the number of positive or negative words in the messages users read affected whether those users then posted positive or negative content in their own status updates. It did: after the exposure, the manipulated users used more negative or positive words in their updates, depending on which kind of content they had been shown.


Naturally, the fact that Facebook conducted such a study without asking users for explicit permission has raised an outcry and serious questions about the ethics of the research. Now Facebook data scientist Adam Kramer has written a post on the social networking site defending the study, reports The Verge. You can read Kramer's full post here.


According to Kramer, the reason they did the study was "because we care about the emotional impact of Facebook and the people that use our product."


He adds, "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper."


Explaining the methodology, he says the social networking site only "very minimally deprioritized a small percentage of content in the News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012)."


He also says the posts were not hidden; they simply did not show up on some News Feeds. Of course, what this means is that while the posts weren't hidden from everybody, they were hidden from the people whose News Feeds were selected for the experiment.


Kramer goes on to say the experiment showed that exposure to a certain kind of emotion, whether positive or negative, "encourages rather than suppresses" that same emotion in users' own posts.


According to Kramer, the impact of the study on people was minimal, and the idea behind it was to provide users with a better Facebook service.


He adds, "I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety. While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices."
