Exeter Computing Club

Penny, our programming lead, presented to Exeter Computing Club on the role of historical racism and sexism in machine learning. Because hiring and university admissions decisions, along with decisions made by companies and banks about neighborhoods with larger minority populations, have historically been biased, algorithms that learn from these historical examples develop racist and sexist tendencies as well. This has happened far too often: Amazon's experimental hiring algorithm favored male candidates over female ones, self-driving cars have more difficulty detecting pedestrians with darker skin, and Google's algorithm for identifying human faces had a much higher success rate on white men.

This also points to an underlying problem in how algorithms are trained: some groups may be represented by too few data points while others dominate the dataset. Again, this reflects the racism and sexism in the tech industry; the Google algorithm was trained mainly on the faces of the people working on it, namely, white men. This has serious implications for the future, when AI will be used more widely for tasks such as screening job applicants, making decisions about different demographics, and identifying pedestrians. As a team, we hope to raise awareness of this problem and encourage a solution.
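To make the training-data problem concrete, here is a minimal sketch (not the Google system, and using entirely synthetic data with hypothetical group names) of how a model trained on a dataset dominated by one group can perform much worse on an underrepresented group:

```python
# Minimal sketch: group imbalance in training data skews performance.
# Synthetic data only; "group A" and "group B" are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's true decision boundary sits at a different place.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Group A dominates the training set; group B is underrepresented.
Xa, ya = make_group(1000, shift=0.0)
Xb, yb = make_group(50, shift=1.5)

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Evaluate on fresh samples: accuracy drops sharply for the
# minority group, because the learned boundary mostly fits group A.
for name, shift in [("A (majority)", 0.0), ("B (minority)", 1.5)]:
    Xt, yt = make_group(500, shift)
    print(name, "accuracy:", model.score(Xt, yt))
```

Running this typically shows high accuracy for group A and near-chance accuracy for group B, mirroring how a face-recognition system trained mostly on white men performs worse on everyone else.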
