More creepy models
I’ve been having fun collecting creepy data-driven models for your enjoyment. My first installment was here, with additional models added by my dear commenters. I’ve got three doozies to add today.
- Girls Around Me. This app uses Foursquare data to find out whether you know any girls in your immediate vicinity, which is perfect for my stalker friends. My favorite part is that the title of this article about it actually uses the word “creepy”.
- Zestcash is a payday-style cash lending service that data-mines its customers, with a stated APR of up to 350%. One of my favorite misleading quotes in this article about the model: “Better accuracy should translate into lower interest rates for consumers.” Ummmm… yeah, for some of them. And I guess the idea goes, those other losers deserve what they get because they’re already poor?
- The creepiest of all by far (because it is so painfully scalable, and I could imagine it being everywhere in two years) is this one, which proposes to embody the “best practices” of medicine in a data science model. Look, we desperately need a good model in health and medicine to do some of the leg-work for us, namely coming up with a list of reasonable treatments that your doctor can take a look at and discuss with you. But we definitely don’t need a model which comes up with that list and then decides for you and your doctor which treatment you should undergo. Decisions like that, which often come down to things like how we value quality of life, cannot and should not be turned into an algorithm.
By the way, just to balance the creepy models a bit, I also wanted to mention a few cool ideas:
- What about having a Reckoner General, like a surgeon general? That person could answer basic questions to explain how models are being used and to head off creepy models. Proposed by my pal Jordan Ellenberg.
- What about having an F.D.A.-like regulator for financial products? They would be in charge of testing the social utility of a proposed instrument class before it went to market. Can we do the same for data-driven models? Can the regulator be kick-ass and reasonably funded?
- What about having a creepy-model auditing board that brings together a bunch of nerds from technology and ethics, looks through the new models, and formally reprimands creepiness, using the power of social pressure to fend it off? They could publicize a list of the creeps who made these models, to call people out and shame them. That really doesn’t happen enough; it’s like the modelers are invisible.
- How about a law saying that if you set a cookie that says “don’t track me for reals”, and you then find someone tracking you anyway, as in saving and selling your information, you can sue for $100K in damages and win?
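The technical half of that last idea already exists, by the way: the Do Not Track (“DNT”) request header, which browsers can send to signal exactly this opt-out. What’s missing is the legal teeth. A minimal sketch of what honoring the signal looks like server-side (the function name here is illustrative, not from any real library):

```python
def tracking_allowed(headers):
    """Return False if the client has opted out of tracking.

    `headers` is a dict of HTTP request headers; "DNT: 1" is the
    standard way a browser says "don't track me".
    """
    return headers.get("DNT") != "1"

print(tracking_allowed({"DNT": "1"}))  # False -- user opted out
print(tracking_allowed({}))            # True -- no signal sent
```

The whole point of the proposal, of course, is that today nothing stops a tracker from ignoring this one-line check, which is where the $100K in damages would come in.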