- cross-posted to:
- technology@lemmy.world
An AI resume screener had been trained on CVs of employees already at the firm, giving people extra marks if they listed “baseball” or “basketball” – hobbies that were linked to more successful staff, often men. Those who mentioned “softball” – typically women – were downgraded.
Marginalised groups often “fall through the cracks, because they have different hobbies, they went to different schools”.
The bias is really introduced at the design stage. Designers should be aware of demographic differences and incorporate them into the model to produce something more balanced. It's far from impossible to design models that don't become biased this way, even when trained on biased data, although that's not to say it's easy.
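As a concrete illustration of what "incorporating that into the model" can look like, here's a minimal sketch of one standard design-stage mitigation, reweighing (Kamiran & Calders, 2012): training examples are weighted so that the protected attribute looks statistically independent of the historical label the model learns from. The data, column names, and the `gender` attribute here are all made up for illustration, not taken from the article.

```python
# Minimal sketch of "reweighing" (Kamiran & Calders, 2012):
# weight each example by w(a, y) = P(a) * P(y) / P(a, y), so the
# protected attribute and the label appear independent during training.
# All data and column names below are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Synthetic screening data with a historical bias baked in.
df = pd.DataFrame({
    "gender": rng.integers(0, 2, n),           # 0 / 1, illustrative groups
    "years_experience": rng.normal(5, 2, n),
})
# Historical "hired" labels skewed toward group 1, mimicking biased CVs.
df["hired"] = (
    df["years_experience"] + 1.5 * df["gender"] + rng.normal(0, 1, n) > 6
).astype(int)

# Marginal and joint frequencies needed for the weights.
p_a = df["gender"].value_counts(normalize=True)
p_y = df["hired"].value_counts(normalize=True)
p_ay = df.groupby(["gender", "hired"]).size() / n

# w(a, y) = P(a) * P(y) / P(a, y) for each training example.
weights = np.array([
    p_a[a] * p_y[y] / p_ay[(a, y)]
    for a, y in zip(df["gender"], df["hired"])
])

# Train on features only (the protected attribute is NOT an input),
# with the corrective weights applied to counteract the skewed labels.
X = df[["years_experience"]]
model = LogisticRegression().fit(X, df["hired"], sample_weight=weights)
```

The point of the sketch is the workflow, not this particular method: the designer measures how the historical labels are skewed across groups and corrects for it before training, rather than letting the model learn the skew as signal (the way "baseball" vs "softball" became a proxy for gender).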