Thanks for a great case study, Facebook!
I’m super excited about the recent “mood study” that was done on Facebook. It constitutes a great case study on data experimentation that I’ll use for my Lede Program class when it starts mid-July. It was first brought to my attention by one of my Lede Program students, Timothy Sandoval.
My friend Ernest Davis at NYU has a page of handy links to big data articles, and at the bottom (for now) there are a bunch of links about this experiment. For example, this one by Zeynep Tufekci does a great job outlining the issues, and this one by John Grohol burrows into the research methods. Oh, and here’s the original research article that’s upset everyone.
It’s got everything a case study should have: ethical dilemmas, questionable methodology, sociological implications, and dubious claims, not to mention a whole bunch of media attention and dissection.
By the way, if I sound gleeful, it’s partly because I know this kind of experiment happens on a daily basis at a place like Facebook or Google. What’s special about this experiment isn’t that it happened, but that we get to see the data. And the response to the critiques might be, sadly, that we never get another chance like this, so we have to grab the opportunity while we can.
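To make the design concrete, here’s a toy sketch (in Python, with made-up numbers) of the kind of analysis behind the study: randomly suppress some emotional content in a treatment group’s feeds, then compare the emotional tone of their subsequent posts against a control group’s. To be clear, this is not the researchers’ code and the numbers are invented; the actual paper used LIWC word counts over hundreds of thousands of users and reported tiny effect sizes.

```python
# Toy sketch of an emotional-contagion A/B test (illustrative only; not the
# paper's actual code or data). We simulate the fraction of positive words
# in users' posts and test whether the treatment arm differs from control.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000  # users per arm; huge samples make tiny effects "significant"

# Hypothetical outcome: share of positive words in subsequent posts, with a
# tiny assumed drop for users whose feeds had positive content reduced.
control = rng.normal(loc=0.052, scale=0.02, size=n)
treated = rng.normal(loc=0.0518, scale=0.02, size=n)

t, p = stats.ttest_ind(treated, control)
print(f"mean difference = {treated.mean() - control.mean():+.5f}")
print(f"t = {t:.2f}, p = {p:.3g}")  # significant, yet practically minuscule
```

The methodological point the critiques hammer on is visible even in this toy version: with enough users, an effect this small will clear any significance threshold, which says nothing about whether it matters.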
Here’s another hypothesis this study might support: the customization of news feeds has contributed to the polarization of politics in our country. There was a time when there were only three major news sources available to people on a daily basis, and the news they provided was governed by a fairness doctrine. The segmentation that began with cable TV has only increased with the internet, which makes it possible for people to get, for example, “Christian News.” This segmentation creates a situation where one’s worldview is constantly reinforced, making it harder for open-mindedness to prevail (see the toy simulation below).
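To see why that reinforcement happens, here’s a toy simulation (purely illustrative, and nothing like any real platform’s actual ranking code): a feed that scores content by estimated affinity and updates that estimate from clicks turns a mild preference into a heavily one-sided stream.

```python
# Toy filter-bubble simulation (illustrative only): a ranker estimates what
# the user likes, shows the higher-scoring side, and reinforces its estimate
# whenever the user clicks. A mild leaning hardens into one-sided exposure.
import random

random.seed(1)
affinity = {"left": 0.5, "right": 0.5}   # ranker's estimate of user taste
true_pref = {"left": 0.6, "right": 0.4}  # the user's actual, mild leaning
shown_counts = {"left": 0, "right": 0}
lr = 0.05  # learning rate for the click-feedback update

for _ in range(2000):
    # Show whichever side currently scores higher, with 10% exploration.
    if random.random() < 0.1:
        shown = random.choice(["left", "right"])
    else:
        shown = max(affinity, key=affinity.get)
    shown_counts[shown] += 1
    clicked = random.random() < true_pref[shown]
    # Reinforce: clicks pull the estimate up, skips pull it down.
    affinity[shown] += lr * ((1.0 if clicked else 0.0) - affinity[shown])

print(shown_counts)  # a mild 60/40 preference yields heavily lopsided exposure
```

The user only ever slightly preferred one side, but because the ranker chases clicks, that side is nearly all they end up seeing.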
Spot on. This started bothering me a year or more ago, when one day I realized I very rarely got anything in my various streams that wasn’t either self-filtered or automatically filtered by what Google thought I wanted to see. I suspected it in my search results too: over time it just “feels” like when I search for something politically charged, the sources returned tend to be ones I already agree with. I now specifically follow several people on Google+ because they are on the other side of some fences that are important to me. But I still don’t know whether I get enough opposing viewpoints in my day, and I seriously doubt many people will go to that much trouble to rectify this in their own daily firehose of information. Deeper polarization seems inevitable when sources feel a need to please the viewer to keep eyes on advertising.
I’m glad you jumped in here. I think I squashed everyone on Twitter over the press this got, as I felt it was just an overzealous data scientist wanting some attention, plus the fact that Facebook has good marketers too :) Of course they want their name in the news all the time so people don’t leave them, no matter what it takes, so they dug this up from two years ago and called it “science” for some press.
It’s just some guy or gal working with data, which, like you said, goes on every day, and people sucked it up and had to construct some weird view of what they felt it meant… like you said, the same old everyday business of playing with algorithms looking for value. I figured it was either driven by Facebook marketing or by a data scientist a little full of himself, or maybe both :) By the time I got on Twitter yesterday I saw all kinds of chatter, echoes of conspiracy theories and you name it, and that’s scary to the average consumer who doesn’t understand coding and algorithms, so it went wild.
Agree it was a good lesson in the power of what I call “Algo Duping” and how it gets all of us at times too, but this was a really big dupe, and maybe the press learned something from it as well. Agree on the value of getting to see the data; we get a glimpse of some of what they are doing, though of course it’s controlled as to what they “want” to show us. The way the news carried on, exaggerating its views, made everyone feel insecure by adding the big OMG dazzle :) Calling it “science” did crack me up a bit, though; as you say, it’s just everyday work looking to create or find value.
I have long suspected, er, hypothesized that Aunt Pythia is some kind of mood-altering Social Media Experiment (SME) — so when do we get to see the under-the-table tabulations on that?
I find no dilemma in the ethics of the Facebook study: I find the study completely unethical. Facebook never obtained explicit authorization from its study subjects. Usage agreements like Facebook’s are worded so broadly that they only show how unethical these companies really are.
Hi Cathy,
Sadly, you’re probably right that this kind of experiment takes place on a daily basis. I personally (scientific point of view aside) think that 90% of internet users have no clue how Facebook or Google filtering works. They are probably not even aware that their information streams are being filtered. The remaining (possibly fewer than) 10% of users have chosen to accept this.
Now, a big plus for those 90% will hopefully be that they become aware of the filtering and the risk of manipulation (be it social experiments or ‘mere’ advertising), and so become less willing to share too much information and care more about their privacy.
What do you think?