June 29, 2014

Facebook’s science experiment on users shows the company is even more powerful and unethical than we thought

Marie Wallace:

Considering that I’ve been harping on about the importance of privacy and ethics in big data analytics, this recent news about Facebook just reinforces why data scientists need to develop a conscience and why social media users need to recognize that they are being manipulated.

Some more of my ramblings on this topic:

Originally posted on PandoDaily:


If you were still unsure how much contempt Facebook has for its users, this will make everything hideously clear.

In a report published in the Proceedings of the National Academy of Sciences (PNAS), Facebook data scientists describe an experiment that manipulated the emotions of nearly 700,000 users to see if positive or negative emotions are as contagious on social networks as they are in the real world. By tweaking Facebook’s powerful News Feed algorithm, some users (we should probably just call them “lab rats” at this point) were shown fewer posts with positive words. Others saw fewer posts with negative words. “When positive expressions were reduced,” the paper states, “people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

The results shouldn’t surprise anybody. What’s more surprising…

