Facebook’s Secret Mood Manipulation Experiment

June 29, 2014

Source: David Holmes, Pando Daily

If you were still unsure how much contempt Facebook has for its users, this will make everything hideously clear.

In a report published in the Proceedings of the National Academy of Sciences (PNAS), Facebook data scientists describe an experiment that manipulated the emotions of nearly 700,000 users to see whether positive and negative emotions are as contagious on social networks as they are in the real world. By tweaking Facebook’s powerful News Feed algorithm, some users (we should probably just call them “lab rats” at this point) were shown fewer posts with positive words. Others saw fewer posts with negative words. “When positive expressions were reduced,” the paper states, “people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
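
To make the mechanics concrete, here is a toy sketch of that kind of filtering. It is purely illustrative: the word list, the omission rate, and the function names are assumptions, not Facebook’s News Feed code; the only thing it shares with the study is the idea of silently dropping a fraction of posts that contain positive (or negative) words.

    # Hypothetical sketch of sentiment-based feed filtering (illustrative only;
    # the word list and omission rate are assumptions, not Facebook's system).
    import random

    POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}

    def contains_positive_word(post_text):
        """Return True if the post contains any word from the positive list."""
        return bool(set(post_text.lower().split()) & POSITIVE_WORDS)

    def filtered_feed(posts, omission_rate=0.5, seed=None):
        """Randomly suppress a fraction of posts containing positive words,
        mimicking a 'reduced positive expressions' condition."""
        rng = random.Random(seed)
        return [p for p in posts
                if not (contains_positive_word(p) and rng.random() < omission_rate)]

    posts = ["I love this sunny weather!",
             "Stuck in traffic again.",
             "Great news, I got the job!"]
    print(filtered_feed(posts, omission_rate=1.0, seed=42))  # only the neutral post survives

A user scrolling past a feed filtered this way has no way to tell it apart from ordinary News Feed ranking, which is exactly what made the participants unknowing.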

The results shouldn’t surprise anybody. What is more surprising, and unsettling, is the power Facebook wields in shifting its users’ emotional states, and its willingness to use that power on unknowing participants. First off, when is it okay to conduct a social behavior experiment on people without telling them? Technically, as the paper states, users provided consent for this research when they agreed to Facebook’s Data Use Policy upon signing up, so what Facebook did isn’t illegal. But it’s certainly unethical.

Furthermore, manipulating user emotions in a digital space comes with uniquely disturbing consequences. In the real world, if you feel like the people around you bring too much negativity into your life, the solution is easy: Find a new crowd. But on Facebook, short of canceling your account, this is impossible to do if the company suddenly decides, whether as part of a research study or at the behest of certain advertising or engagement interests, to start sending more negative content your way. The whole point of the News Feed algorithm, to hear Facebook tell it, is to give users an experience tailored to their wants and interests. Clearly, that objective falls by the wayside anytime Facebook wants to turn its user base into a science experiment.

And then there’s the tone-deaf gall of the whole thing: this research wasn’t uncovered by an investigative reporter; Facebook submitted it to PNAS itself. To make matters worse, there are questions about whether the methodology was even sound. To determine “positive” and “negative” sentiments, the researchers used a technique called Linguistic Inquiry and Word Count, or LIWC, which scores a text by counting how many of its words appear on lists of positive and negative emotion words. But even the creators of LIWC admit that assessing its validity when applied to “natural language” (like a Facebook update) is “tricky.” LIWC’s reliability has largely been tested by analyzing essays, where there is more repetition than in natural language.
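
For a sense of how fragile that word-count approach is on short posts, here is a minimal LIWC-style scorer. The tiny word lists below are placeholders chosen for illustration, not the actual LIWC dictionaries.

    # Minimal word-count sentiment scoring in the spirit of LIWC
    # (the word lists are placeholders, not the real LIWC dictionaries).
    import re

    POSITIVE = {"happy", "love", "great", "good", "fun"}
    NEGATIVE = {"sad", "hate", "awful", "bad", "angry"}

    def emotion_word_rates(text):
        """Return the fractions of words that are positive and negative."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0, 0.0
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return pos / len(words), neg / len(words)

    print(emotion_word_rates("Had a great day, so happy with my friends!"))  # roughly (0.22, 0.0)

On a post this short, adding or removing a single emotion word swings the score dramatically, which is one reason applying an essay-validated word counter to status updates draws skepticism.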
