Considering that I’ve been harping on about the importance of privacy and ethics in big data analytics, this recent news about Facebook just reinforces why data scientists need to get a conscience, and why social media users need to recognize that they are being manipulated.
Some more of my ramblings on this topic:
- Why Privacy shouldn’t be considered an impediment to innovation, but an opportunity to innovate. (June 2014)
- Privacy By Design; the only way to go! (May 2014)
- It’s time for Data Scientists to prioritize Privacy & Ethics above all else! (December 2013)
- C’mon guys, let’s get serious about Privacy! (July 2013)
- The Hippocratic Oath for the Data Scientist (July 2013)
- Is PRIVACY the Software Industry’s SOPA? (February 2012)
- Is privacy dead, or merely snoozing? (September 2011)
Originally posted on PandoDaily:
If you were still unsure how much contempt Facebook has for its users, this will make everything hideously clear.
In a report published in the Proceedings of the National Academy of Sciences (PNAS), Facebook data scientists describe an experiment that manipulated the emotions of nearly 700,000 users to see whether positive and negative emotions are as contagious on social networks as they are in the real world. By tweaking Facebook’s powerful News Feed algorithm, some users (we should probably just call them “lab rats” at this point) were shown fewer posts with positive words. Others saw fewer posts with negative words. “When positive expressions were reduced,” the paper states, “people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
The results shouldn’t surprise anybody. What’s more surprising…