Who is Dr John WorldPeace?

 

COMMENTARY BY DR JOHN WORLDPEACE FOR PRESIDENT 2016

It is arrogant and egregious for Mark Zuckerberg to treat his members with such condescending contempt.

Dr John WorldPeace

150101 0800 


SOURCE: FORBES

Facebook Manipulated 689,003 Users' Emotions For Science

On Facebook, you may be a guinea pig and not know it.

June 29: Updated with statement from Facebook, from the author of the study, and from the editor of the academic journal that published the study.

Facebook is the best human research lab ever. There’s no need to get experiment participants to sign pesky consent forms, as they’ve already agreed to the site’s data use policy. A team of Facebook data scientists is constantly coming up with new ways to study human behavior through the social network. When the team releases papers about what it’s learned from us, we often learn surprising things about Facebook instead, such as the fact that it can keep track of the status updates we never actually post.

Facebook has played around with manipulating people before, getting 60,000 people to rock the vote in 2010 who theoretically wouldn’t have otherwise, but a recent study shows Facebook playing a whole new level of mind gamery with its guinea-pig users. As first noted by The New Scientist and Animal New York, Facebook’s data scientists manipulated the News Feeds of 689,003 users, removing either all of the positive posts or all of the negative posts to see how it affected their moods. If there was a week in January 2012 when you saw only photos of dead dogs or incredibly cute babies, you may have been part of the study. Now that the experiment is public, people’s mood about the study itself would best be described as “disturbed.”

The researchers, led by data scientist Adam Kramer, found that emotions were contagious. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” according to the paper the Facebook research team published in PNAS. “These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
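To make the mechanics concrete, here is a toy Python sketch of how an experiment like this measures “emotional contagion.” This is not Facebook’s code; the published study scored posts with the LIWC word-counting tool, and the word lists and function names below are purely illustrative.

    # Toy illustration of the study's design: classify each post as positive
    # or negative by counting emotion words, suppress one kind from a feed
    # (the manipulation), and score what users write afterward (the outcome).
    # Hypothetical word lists; the real study used the LIWC dictionaries.
    POSITIVE = {"happy", "great", "love", "wonderful"}
    NEGATIVE = {"sad", "angry", "terrible", "hate"}

    def post_tone(text):
        words = text.lower().split()
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    def filtered_feed(posts, suppress):
        # The experimental manipulation: drop posts of the suppressed tone.
        return [p for p in posts if post_tone(p) != suppress]

    def percent_positive_words(posts):
        # The outcome measure: the share of emotion words that are positive
        # in a user's subsequent status updates.
        words = [w for p in posts for w in p.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return pos / (pos + neg) if pos + neg else 0.0

Comparing percent_positive_words() for users whose feeds suppressed positive posts against a control group is, in miniature, the analysis behind the paper’s headline finding.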

The experiment ran for a week — January 11–18, 2012 — during which the hundreds of thousands of Facebook users unknowingly participating may have felt either happier or more depressed than usual, as they saw either more of their friends posting ’15 Photos That Restore Our Faith In Humanity’ articles or despondent status updates about losing jobs, getting screwed over by X airline, and already failing to live up to New Year’s resolutions. “*Probably* nobody was driven to suicide,” tweeted one professor linking to the study, adding a “#jokingnotjoking” hashtag.

The researchers — who may not have been thinking about the optics of a “Facebook emotionally manipulates users” study — jauntily note that the study undermines people who claim that looking at our friends’ good lives on Facebook makes us feel depressed. “The fact that people were more emotionally positive in response to positive emotion updates from their friends stands in contrast to theories that suggest viewing positive posts by friends on Facebook may somehow affect us negatively,” they write.

They also note that when they took all of the emotional posts out of a person’s News Feed, that person became “less expressive,” i.e., wrote fewer status updates. So prepare to have Facebook curate your feed with the most emotional of your friends’ posts if it feels you’re not posting often enough.

So is it okay for Facebook to play mind games with us for science? It’s a cool finding, but manipulating unknowing users’ emotional states to get there puts Facebook’s big toe on that creepy line. Facebook’s data use policy (which I’m sure you’ve all read) says Facebookers’ information will be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” making all users potential experiment subjects. And users know that Facebook’s mysterious algorithms control what they see in their News Feed. But it may come as a surprise to users to see those two things combined like this. When universities conduct studies on people, they have to run them by an ethics board first to get approval. Those boards were created because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience and letting men live with untreated syphilis for study purposes. A 2012 profile of the Facebook data team noted, “Unlike academic social scientists, Facebook’s employees have a short path from an idea to an experiment on hundreds of millions of people.”

Update 6/30/14: Cornell University released a statement Monday morning saying its ethics board, which is supposed to approve any research on human subjects, passed on reviewing the study because the part involving actual humans was done by Facebook, not by the Cornell researcher involved in the study. The academic researchers did help design the study, though, as noted when it was published, so this seems a bit disingenuous.

In its initial response to the controversy around the study — a statement sent to me late Saturday night — Facebook doesn’t seem to really get what people are upset about, focusing on privacy and data use rather than the ethics of emotional manipulation and whether Facebook’s TOS lives up to the definition of “informed consent” usually required for academic studies like this. “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Ideally, Facebook would have a consent process for willing study participants: a box to check somewhere saying you’re okay with being subjected to the occasional random psychological experiment that Facebook’s data team cooks up in the name of science, as opposed to the commonplace psychological manipulation cooked up by advertisers trying to sell you stuff.