Facebook News Feed experiment isn't all bad: A look at the other side of the controversy

There are some reports circulating on the web trying to point out the other, and if we may say so, 'brighter' side of the controversy


In the past couple of days, we've seen activists and data regulators up in arms against Facebook for conducting its News Feed experiment, on the grounds that it was grossly unethical.

 

The study which is now making headlines happened in 2012, when Facebook fiddled with the News Feeds of nearly 700,000 users, altering the content they saw to check how positive or negative content affected their emotions. The study was published in the 17 June edition of the Proceedings of the National Academy of Sciences.

 

As soon as the study was made public, the social network came in for considerable criticism. Though the terms and conditions of the social network explicitly say that it can use users' data, it should be noted that the study was conducted well before the social site tweaked its terms of service agreement.

 

 

And while this does deserve people's ire, and what Facebook did may well be downright unethical, is there more to this than meets the eye? Some reports circulating on the web are trying to point out the other, 'brighter' side of the controversy.

 

One such argument is that with an increasing number of people accessing Facebook, such studies are, in fact, a great way to provide insights into if and how the network affects human behaviour. The NYTimes' Farhad Manjoo writes, "Most web companies perform extensive experiments on users for product testing and other business purposes, but Facebook, to its credit, has been unusually forward in teaming with academics interested in researching questions that aren’t immediately pertinent to Facebook’s own business."

 

"It is only by understanding the power of social media that we can begin to defend against its worst potential abuses. Facebook’s latest study proved it can influence people’s emotional states; aren’t you glad you know that? Critics who have long argued that Facebook is too powerful and that it needs to be regulated or monitored can now point to Facebook’s own study as evidence," he adds.

 

The big question now is whether people have stopped using Facebook after the controversy, and the answer is 'No'. Under Facebook's current terms, you hand over your data as soon as you click that submit button. Will this stop you from using Facebook now? "Well, get over it Facebook users. If you are a Facebook user, you willingly give Facebook every bit of data it has about you. Facebook’s data tests are not new. Facebook regularly manipulates what you see. It changes its hugely complex algorithm to show you less posts from people you don’t interact with often," states a report by ZDNet.

 

Another report suggests that Facebook hasn't broken any laws. It states, "The PNAS article claims that Facebook’s Data Usage Policy acts as informed consent. And Facebook’s Data Usage Policy now reads, “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” This means that Facebook is well within the bounds of its own terms of service and data usage policy, meaning Facebook didn't do anything technically wrong." However, this doesn't mean Facebook hasn't violated the principle of 'informed consent'; it's just that informed consent isn't required by law.

The report further states that Facebook is in the 'emotions business' and simply tried to make its product better. "Removing posts that had positive or negative words, reduced the words produced in status updates. If people don’t update their statuses, Facebook has no product. If you have a positive experience on Facebook you are likely to return. If you have a negative experience, you are less likely to return," the report adds.

 

Another aspect worth noting is that Facebook has been open about its study, which is why it chose to publish it. Several companies continuously carry out studies on their products, and given the response Facebook's study received, we may see fewer studies examining the impact of social media on human behaviour.

 

"Wouldn’t you also be interested in what other tech companies know about us? How does Google’s personalized search algorithm reinforce people’s biases? How does Netflix’s design shape the kinds of TV shows we watch? How does race affect how people navigate dating sites? After the outcry against the Facebook research, we may see fewer of these studies. That would be a shame," added Manjoo.
