Programmed bias

By Di on September 26, 2019
Topical

Human-built machines immortalize human problems, as we keep discovering. Voice recognition software struggles to identify higher-pitched (i.e., predominantly women’s) voices. Facial recognition software is far better at identifying white men’s faces than literally anyone else’s. Motion sensors often fail to detect dark skin, a problem that also affects some wearable health monitors.

Perhaps the most famous example came when Amazon built AI software to sort through resumes and identify top applicants. Because the software trained on Amazon’s own recruiting and hiring record from the previous ten years, it penalized resumes that mentioned the word “women’s” and disregarded candidates from women’s colleges. It based its definition of an optimal candidate on past human hiring decisions, and since tech is so dominated by men, that definition assumed the optimal candidate would be a man as well.
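To see the mechanism concretely, here is a minimal sketch with invented data. It is not Amazon’s actual system, just a toy text classifier trained on hypothetical biased hiring labels:

```python
# Toy illustration (invented data): a resume classifier trained on
# biased historical outcomes learns to penalize the word "women's".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of the chess club, software engineer",
    "women's chess club captain, software engineer",
    "graduate of a women's college, data analyst",
    "graduate of a state college, data analyst",
]
hired = [1, 0, 0, 1]  # hypothetical labels mirroring biased past hiring

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative:
# nothing in this code mentions gender, yet the model has absorbed
# the bias baked into its training labels.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

Notice that no line of the code singles out women; the penalty emerges entirely from the labels the model was given, which is the same failure mode described above.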

It’s all really one problem: when you don’t hire diverse candidates, your tools will assume a certain kind of person is preferred (or at least the default), and they will perpetuate that assumption, thoughtlessly, for as long as they run.

Discussion
How can the white male bias in the world's tech companies be changed?
Why are women still underrepresented in business and politics?
Describe a female role model you've had in your life, not including a relative.