
Google’s mistake with Gemini

March 12, 2024

You have probably heard that Google had to suspend its Gemini image feature after it showed people black Nazis and female popes.

Well, I have a simple explanation for what happened here. The folks at Google wanted to avoid an embarrassment they'd been involved with multiple times, and seen others get involved with: the "pale male" dataset problem. It happens especially at tech companies dominated by white men, and ironically, even more so at tech companies dominated by white men who are careful about privacy, because then they only collect pictures of people who give consent, which is typically the people who work there! See for example this webpage, or Safiya Noble's entire book.

So, in order to avoid this embarrassment, they scaled up a notion of diversity. Which is to say, they stuck diversity into everything, at scale, even if it made no historical sense.
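Reporting at the time suggested the fix amounted to silently rewriting image prompts to tack on diversity modifiers. As a thought experiment, here is a minimal sketch of what such a blanket rule looks like in code; the trigger words, modifier list, and function name are all my own hypotheticals, not Google's actual system:

```python
import random

# Hypothetical, oversimplified "diversity at scale" rule: bolt a random
# diversity modifier onto any prompt that mentions a person, with no
# awareness of historical or cultural context.
DIVERSITY_MODIFIERS = ["Black", "South Asian", "East Asian", "female", "Indigenous"]
PERSON_WORDS = {"person", "man", "woman", "soldier", "pope", "king", "founder"}

def rewrite_prompt(prompt: str) -> str:
    words = [w.strip(".,") for w in prompt.lower().split()]
    if any(w in PERSON_WORDS for w in words):
        # Fires on "a 1943 German soldier" exactly as it fires on
        # "a software engineer"; context never enters the decision.
        return f"{random.choice(DIVERSITY_MODIFIERS)} {prompt}"
    return prompt

print(rewrite_prompt("a 1943 German soldier"))  # historically nonsensical result
print(rewrite_prompt("a medieval pope"))        # same blanket rule, same failure
```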

In other words, the real mistake they made was to think that equity or fairness is something that can be done at scale.

In reality equity and fairness are narrowly defined, contextual notions. When we decide it’s fair to use a FICO score in order to determine an interest rate on a loan, that’s very different from using a FICO score to decide how many weeks of unemployment insurance you should receive after breaking your leg. You cannot decide that “FICO scores are legitimate discriminators” as a universal rule, just as “diverse skin tones and genders” is not a universal good, especially when “diverse” is not even a well-defined notion unless you specify a geographic area or culture.
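To make the contextual point concrete, here is a toy sketch (my own illustration, not any real system) in which the very same feature is permitted in one decision context and forbidden in another:

```python
# Toy illustration: whether a feature is a legitimate discriminator depends
# on the decision context, so the policy has to be a per-context table,
# not a universal rule. Contexts and features here are hypothetical.
FEATURE_POLICY = {
    "loan_interest_rate":    {"fico_score": True},   # arguably fair game here
    "unemployment_benefits": {"fico_score": False},  # not here
}

def feature_allowed(context: str, feature: str) -> bool:
    """Look up whether a feature may be used in this decision context."""
    return FEATURE_POLICY.get(context, {}).get(feature, False)

assert feature_allowed("loan_interest_rate", "fico_score")
assert not feature_allowed("unemployment_benefits", "fico_score")
```

The point of the table is that it has to be filled in one context at a time; there is no row that says "always."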

This mistake that Google made was not a coincidence, by the way. It’s a result of a combination of laziness (as in, they just didn’t think very hard about this) and capitalism (as in, it would be expensive to do this right).

  1. Shelly Webster
    March 12, 2024 at 10:40 am

    I thought this (error) might lead to some interesting thought experiments. The user would need to know or investigate the reason the results they got were incorrect. Could be illuminating. At the very least it would require people to be less immediately accepting of generated results.


  2. Prof. C. Sormani
    March 12, 2024 at 4:01 pm

    A private comment to you personally: I think showing a female pope was not so bad. Hopefully it will happen someday.

    Christina


    • March 13, 2024 at 12:55 am

      There actually was a female Pope once upon a time…didn't catch on, though.


  3. March 13, 2024 at 8:01 am

    The most ironic thing here is the black Nazi guy… When it comes to portraying a villain, the AI can't show a white guy


Comments are closed.