Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph.
When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology performs on people of different races and genders.
These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how biases in the real world can seep into artificial intelligence, the computer systems that power facial recognition.
Facial recognition algorithms made by Microsoft, IBM and Face++ were more likely to misidentify the gender of black women than white men. Gender was misidentified in up to 1 percent of lighter-skinned males (in a set of 296 photos), up to 7 percent of lighter-skinned females (385 photos) and up to 12 percent of darker-skinned males (318 photos).

Read more from nytimes.com…