Programmed bias


Human-built machines immortalize human problems, as we keep discovering. Voice recognition software struggles to identify higher-pitched (i.e., predominantly women's) voices. Facial recognition software is far better at identifying white men's faces than anyone else's. Motion sensors often fail to detect dark skin, a problem that also affects some wearable health monitors.

Perhaps the most famous example is Amazon's AI software for sorting through resumes to identify top applicants. The software trained itself on a decade of the company's past recruiting and hiring decisions, so it learned to penalize any resume that mentioned "women's" and to disregard candidates from women's colleges. It based its definition of an optimal candidate on human hiring decisions, and since tech is so dominated by men, that definition assumed the optimal candidate would be a man as well.

It's all really one problem: when companies don't hire diverse candidates, tools trained on that hiring data will assume a certain kind of person is preferred (or at least the default), and will perpetuate that assumption thoughtlessly for as long as they run.

Discussion: 
How can the white male bias in the world's tech companies be changed?
What about other fields? Why are women still underrepresented in business and politics?
Describe a female role model (not a relative) you've had in your life.