Google’s mistake with Gemini
You have probably heard that Google had to suspend its Gemini image feature after it showed people Black Nazis and female popes:
Well, I have a simple explanation for what happened here. The folks at Google wanted to avoid an embarrassment that they'd been involved with multiple times, and seen others get involved with: the "pale male" dataset problem. It happens especially at tech companies dominated by white men, and ironically, most of all at tech companies dominated by white men who are careful about privacy, because then they only collect pictures of people who give consent, which is typically the people who work there! See for example this webpage, or Safiya Noble's entire book.
So, in order to avoid this embarrassment, they scaled up a notion of diversity. Which is to say, they stuck diversity into everything, at scale, even when it made no historical sense.
In other words, the real mistake they made is to think that equity or fairness is something that can be done at scale.
In reality equity and fairness are narrowly defined, contextual notions. When we decide it’s fair to use a FICO score in order to determine an interest rate on a loan, that’s very different from using a FICO score to decide how many weeks of unemployment insurance you should receive after breaking your leg. You cannot decide that “FICO scores are legitimate discriminators” as a universal rule, just as “diverse skin tones and genders” is not a universal good, especially when “diverse” is not even a well-defined notion unless you specify a geographic area or culture.
This mistake of Google's was not a coincidence, by the way. It's the result of a combination of laziness (as in, they just didn't think very hard about this) and capitalism (as in, it would be expensive to do this right).
I thought this (error) might lead to some interesting thought experiments. The user would need to know or investigate the reason the results they got were incorrect. Could be illuminating. At the very least it would require people to be less immediately accepting of generated results.
A private comment to you personally: I think showing a female pope was not so bad. Hopefully it will happen someday.
Christina
There actually was a female Pope once upon a time… didn't catch on, though.
The most ironic thing here is the Black Nazi guy… when it's a villain to be portrayed, the AI can't show a white guy.