When big data goes bad in a totally predictable way
1. When an unemployed black woman pretends to be white her job offers skyrocket (Urban Intellectuals, h/t Mike Loukides). Excerpt from the article: “Two years ago, I noticed that Monster.com had added a “diversity questionnaire” to the site. This gives an applicant the opportunity to identify their sex and race to potential employers. Monster.com guarantees that this “option” will not jeopardize your chances of gaining employment. You must answer this questionnaire in order to apply to a posted position—it cannot be skipped. At times, I would mark off that I was a Black female, but then I thought, this might be hurting my chances of getting employed, so I started selecting the “decline to identify” option instead. That still had no effect on my getting a job. So I decided to try an experiment: I created a fake job applicant and called her Bianca White.”
2. How big data could identify the next felon – or blame the wrong guy (Bloomberg). From the article: “The use of physical characteristics such as hair, eye and skin color to predict future crimes would raise ‘giant red privacy flags’ since they are a proxy for race and could reinforce discriminatory practices in hiring, lending or law enforcement, said Chi Chi Wu, staff attorney at the National Consumer Law Center.”
3. How algorithms magnify misbehavior (the Guardian, h/t Suresh Naidu). From the article: “For one British university, what began as a time-saving exercise ended in disgrace when a computer model set up to streamline its admissions process exposed – and then exacerbated – gender and racial discrimination.”
This is just the beginning, unfortunately.