Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • kromem@lemmy.world · 4 months ago (edited)

    Except these kinds of data-driven biases can creep in in all sorts of ways.

    Is there a bias in which images have labels and which don’t? Did they focus only on English labeling? Did they use a vision-based model to add synthetic labels to unlabeled images, and if so, did the labeling model introduce biases of its own?

    Just because the sampling is broad doesn’t mean the processes involved don’t introduce procedural bias distinct from social biases.
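
    To make the point concrete, here is a minimal toy sketch (all numbers and names are hypothetical, chosen purely for illustration): even if a corpus is perfectly balanced across two groups, training only on the *captioned* subset skews the effective training distribution whenever caption coverage differs between groups.

    ```python
    import random

    random.seed(0)

    # Hypothetical toy corpus: each "image" belongs to region A or B and
    # may or may not carry an English caption. Caption coverage differing
    # by region is an assumption made purely to illustrate procedural bias.
    caption_rate = {"A": 0.9, "B": 0.3}  # region B is under-captioned

    corpus = [
        {"region": r, "captioned": random.random() < caption_rate[r]}
        for r in ("A", "B")
        for _ in range(10_000)
    ]

    def share_of_region_b(images):
        """Fraction of the given images that come from region B."""
        return sum(img["region"] == "B" for img in images) / len(images)

    # By construction the full corpus is balanced (50% from each region),
    # but filtering to captioned images shifts the mix toward region A --
    # a procedural bias, with no social bias anywhere in the pipeline.
    labeled_only = [img for img in corpus if img["captioned"]]
    print(f"share of B overall: {share_of_region_b(corpus):.2f}")
    print(f"share of B labeled: {share_of_region_b(labeled_only):.2f}")
    ```

    The same filtering effect applies to any preprocessing step that drops or down-weights data non-uniformly, which is the sense in which broad sampling alone doesn’t guarantee an unbiased training set.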