
The private-data-for-services trade fallacy

November 1, 2013

I had a great time at Harvard Wednesday giving my talk (prezi here) about modeling challenges. The audience was fantastic and truly interdisciplinary, and they pushed back and challenged me in a great way. I’m glad I went and I’m glad Tess Wise invited me.

One issue that came up is something I want to talk about today, because I hear it all the time and it’s really starting to bug me.

Namely, the fallacy that people, especially young people, are “happy to give away their private data in order to get the services they love on the internet”. The actual quote came from the IBM guy on the congressional subcommittee panel on big data, which I blogged about here (point #7), but I’ve started to hear that reasoning more and more often from people who insist on side-stepping the issue of data privacy regulation.

Here’s the thing. It’s not that people don’t click “yes” on those privacy forms. They do click yes, and I acknowledge that. The real problem is that people generally have no clue what it is they’re trading.

In other words, this idea of an omniscient market participant with perfect information making a well-informed trade, which we’ve already seen is not the case in the actual market, is doubly or triply not the case when you think about young people giving away private data for the sake of a phone app.

Just to be clear about what these market participants don’t know, I’ll make a short list:

  • They probably don’t know that their data is aggregated, bought, and sold by Acxiom, which they’ve probably never heard of.
  • They probably don’t know that Facebook and other social media companies sell stuff about them even if their friends don’t see it and even though it’s often “de-identified”. Think about this next time you sign up for a service like “Bang With Friends,” which works through Facebook.
  • They probably don’t know how good algorithms are getting at identifying de-identified information.
  • They probably don’t know how this kind of information is used by companies to profile users who ask for credit or try to get a job.
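To make the third point above concrete, here’s a minimal sketch of a classic linkage attack: joining a “de-identified” dataset to a public one on quasi-identifiers. All the data below is made up for illustration, and this is pure standard-library Python, not anyone’s actual pipeline. Latanya Sweeney famously showed that ZIP code, birth date, and sex alone uniquely identify the large majority of Americans.

```python
# Hypothetical illustration of a linkage attack: a "de-identified"
# medical dataset still contains quasi-identifiers (ZIP, birth date,
# sex), which can be joined against a public record like a voter roll
# to re-attach names to sensitive records.

deidentified_records = [
    {"zip": "02139", "birth_date": "1970-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1985-01-02", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "Alice Example", "zip": "02139", "birth_date": "1970-07-31", "sex": "F"},
    {"name": "Bob Sample",    "zip": "02139", "birth_date": "1985-01-02", "sex": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join two datasets on the quasi-identifier triple (zip, birth_date, sex)."""
    # Index the public dataset by its quasi-identifiers.
    index = {(p["zip"], p["birth_date"], p["sex"]): p["name"] for p in public_rows}
    matches = []
    for row in anon_rows:
        key = (row["zip"], row["birth_date"], row["sex"])
        if key in index:
            # The "anonymous" diagnosis now has a name attached.
            matches.append({"name": index[key], "diagnosis": row["diagnosis"]})
    return matches

for m in reidentify(deidentified_records, public_voter_roll):
    print(m["name"], "->", m["diagnosis"])
```

Nothing here is sophisticated, which is the point: stripping names is not the same as anonymizing, and real attacks (the Netflix Prize de-anonymization, for instance) use far richer signals than three columns.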

Conclusion: people are ignorant of what they’re giving away to play Candy Crush Saga[1]. And whatever they’re giving away, the consequences lie far enough in the future that they’re not worried about them right now. In any case it’s not a fair trade by any means, and we should stop referring to it as one.

What is it instead? I’d say it’s a trick. A trick which plays on our own impulses and short-sightedness and possibly even a kind of addiction to shiny toys in the form of candy. If you give me your future, I’ll give you a shiny toy to play with right now. People who click “yes” are not signaling that they’ve thought deeply about the consequences of giving their data away, and they are certainly not making the definitive political statement that we don’t need privacy regulation.

1. I actually don’t know the data privacy rules for Candy Crush and can’t seem to find them, for example here. Please tell me if you know what they are.

Categories: data science, modeling, rant
  1. Joshua
    November 1, 2013 at 7:47 am

I strongly agree with this and it is a powerful argument against libertarianism. People often don’t know what they want. When they think they know, they often don’t know if it is good for them. When they know what they want and think it is good for them, they often aren’t able to take the required action. At each stage, an external party with the right information and (often simple) tactics can powerfully influence the outcome.

    But . . .

    (1) Is there anyone else who knows better than the individual what they want?
    (2) Who knows what is the good/right/best action or decision?
    (3) Who should be authorized to guide/force/dictate/influence action?

As far as I can tell, any single or consistent set of answers to those questions has its own problems. What could work (and is, perhaps, close to what we have) is a set of strategies that can be flexibly deployed for different scenarios, with competition (between strategies) and analysis of empirical outcomes to identify successes/mistakes.

    (1) Who knows what the individual wants:
    – Usually the individual in question is best placed
    – Almost always they need the right information
    – Often they need the information to be well presented
    – Sometimes (often, always?) they need some training/education

    (2) Who knows what is good?
    – Usually there is no clear choice; the options carry benefits and costs that involve difficult comparisons (which usually require moderate training to understand)
    – Often depends on what “good” means in this case and different parties will have different incentives
    – Often (usually?) not the individual
    – Sometimes an expert
    – Sometimes no one?

    (3) Who can guide/force/dictate/influence?
    I don’t know. It seems to range from obvious examples where no one should be allowed to dictate or force a particular choice to much more subtle questions where design decisions make a difference.

    Finally, a point I’d long rejected (forcefully) but which has increasing appeal as I get older (because of my own children?): maybe family and the local community are often the right compromise agent. They can be close enough to understand what the individual wants, but have access to a useful set of information/experience, and can be insulated from many cognitive decision biases. Of course . . . they can get all of these points badly wrong, too!


  2. Guest2
    November 1, 2013 at 9:22 am

    Great read and cautionary screed! I wish everyone understood this ….


  3. Scott S
    November 1, 2013 at 9:27 am

    Totally with you on this… it is a bit like the nuclear arms race… so what, a stable nation has nukes… stable today, that is. But what about a change of government, party, or the coup of tomorrow… that’s the unknowable.
    All companies will pledge security of identity, and possibly at first that is what they adhere to, but then along comes a “Change of policy notification” that no one reads… [if they bother to send it] and bingo! Your ID info is now someone else’s property.


  4. November 1, 2013 at 10:02 am

    I also strongly agree, but I take exception to Joshua’s statement that this is a “powerful argument against libertarianism”. I would rather think this is a strong argument against asymmetrical power and asymmetrical information. The individual is not being told / educated about what a company can and will do with any and all information given. And individual privacy is nonexistent in this country (compared to Europe) while corporate opacity is rampant.

    Actually liberals and libertarians are not very far apart on most issues.


  5. November 1, 2013 at 12:17 pm

    In fairness, I would be much quicker to give away my data if companies were better at targeting me. For example, if I could figure out how to tell Pandora that I have a Ph.D. and therefore don’t need ads for completing my degree at for-profit institutions, I’d do that in a heartbeat.


  6. November 1, 2013 at 1:10 pm

    Ever since Netscape (I think) invented the cookie in the mid-90s, we’ve been blindly giving up our data for immediate gratification.
    In the early colonial days, Europeans dangled glass beads (OK, I’m oversimplifying) in the faces of the original inhabitants in exchange for the land and its riches. Today, around the world, that transferred “ownership” is still being hotly contested.
    Right now Google and just about everyone else, to secure their claims on the vast fields of personal data across the new cyber world, are circling the wagons and making extensive and calculated updates to their privacy policies.
    These are the new “treaties” outlining what we, the public, owners of our personal data, are expected to give up for the digital trinkets and playthings we receive.
    Will we have to wait two or three hundred years before we collectively awake to what we’ve lost or is it already too late?


  7. Horace Boothroyd III
    November 5, 2013 at 3:11 am

    You are precisely correct, and this is why I have such a short temper with people running around squealing NSA SURVEILLANCE IS THE WORST THING EVER!!!!!! The NSA wouldn’t know sheet if these people had not turned over all their private data to the corporations in the first place.

    Know your enemy, people, and take measures to protect yourselves.


    • November 5, 2013 at 9:47 am

      Horace, it might be more productive if you redirect your anger toward the NSA, Google and all the other internet companies (dare I say “predators”?) who’ve known about the data harvesting possibilities pretty well from the get-go. It all may have started as simplistic efforts to monetize digital assets and do market research on users, but it has morphed into a massive invasion of privacy while our naivety and bedazzlement at this brave new world on the surface led us deeper and deeper into the vast and ever expanding swamp of data underneath. For example, the term “metadata” has only recently entered the general discourse, and many still don’t understand how even anonymous surfing, texting and phone calls can lead directly to us by the most Byzantine pathways.
      We are squealing more concertedly about all this these days because we’ve been properly alerted to the seriousness of our situation only relatively recently thanks to Assange, Snowden and others. I don’t remember any official of the government, the NSA or the internet industry ever alerting us to the full extent of the dangers of our online activities. In fact, they have aggressively suppressed or ignored the small group of whistleblowers, like William Binney and Thomas Drake, who valiantly tried to alert us to these very problems as they were developing over the last decade.
      The genie is definitely out of the bottle and for most of us our bits of data are littered all over the internet waiting for someone or some agency to activate the algorithm that will gather them together and define us in ways we still cannot imagine. It’s a pretty good reason to scream, I’d say.

