Making AI Facial Recognition Less Racist

AI has a well-documented habit of recognizing some faces far less reliably than others. The scale of the problem was highlighted by recent work from MIT and Stanford University, which found that three commercially available facial-analysis programs exhibited significant gender and skin-type biases.

For instance, the programs determined the gender of light-skinned men almost perfectly, but their error rates for darker-skinned women exceeded 34 percent.
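
The key to surfacing this kind of disparity is disaggregated evaluation: computing error rates separately for each demographic subgroup instead of reporting a single overall accuracy. The sketch below illustrates the idea in Python; the subgroup names, sample records, and helper function are hypothetical and not taken from the study itself.

```python
# Minimal sketch of disaggregated evaluation: error rates are computed per
# skin-type/gender subgroup rather than averaged over the whole test set.
# The group labels and sample data below are hypothetical.

from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_gender, predicted_gender) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, true_gender, predicted_gender in records:
        totals[group] += 1
        if predicted_gender != true_gender:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical example data: (subgroup, ground truth, model prediction)
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "female", "male"),    # misclassification
    ("darker-skinned female", "female", "female"),
]

for group, rate in error_rates_by_group(sample).items():
    print(f"{group}: {rate:.0%} error rate")
```

A single aggregate accuracy figure would hide exactly the gap the researchers found; breaking the results out by subgroup makes it visible.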