
Facial recognition software often fails to recognize transgender people, reinforcing stereotypes about what genders should look like

Those who identified as agender, genderqueer or non-binary, indicating that they identify as neither male nor female, were mischaracterized 100% of the time.
UPDATED FEB 26, 2020

Your phone or a hidden camera in your vicinity may not be doing a good job of recognizing people who are transgender or non-binary, according to the findings of a new study. The facial recognition software in these devices, says the research team, misidentifies transgender people more than one-third of the time.

"We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders," says lead author Morgan Klaus Scheuerman, a PhD student in the Information Science department. "While there are many different types of people out there, these systems have an extremely limited view of what gender looks like," adds Scheuerman.

According to the researchers, the findings suggest that the computer vision systems which power facial detection and facial analysis do not handle the level of gender diversity we live with every day.

The researchers also tested whether the software made stereotypical judgments. When Scheuerman, who is male and has long hair, submitted his own picture, half of the services categorized him as female.

"These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman. And that impacts everyone," says Scheuerman.

The software has erred before too. Previous reports have shown that these systems misidentify women of color as much as one-third of the time, while accurately assessing the gender of white men. This led the research team to suspect that the software may also carry a bias around gender identity.

To test this, the team turned to Instagram and collected images of faces whose owners had tagged them with a hashtag indicating their gender identity. They divided the 2,450 images into seven groups (#women, #man, #transwoman, #transman, #agender, #agenderqueer, #nonbinary), each containing 350 images. These images were then fed into the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.
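The scoring in an evaluation like this is straightforward: each provider's predicted label is compared with the self-identified hashtag label for every image, and accuracy is tallied per group. Below is a minimal sketch of that tallying step; `classify_gender()` is a hypothetical stand-in for whichever provider's face-analysis call is being tested, and the folder layout is assumed, not taken from the study.

```python
import os
from collections import defaultdict

# Hypothetical stand-in for a provider's facial-analysis call
# (IBM, Amazon, Microsoft or Clarifai). These services return only
# a binary "male" or "female" label for a detected face.
def classify_gender(image_path: str) -> str:
    raise NotImplementedError("call the provider's API here")

# Images organized by the self-identified hashtag they were collected
# under, e.g. dataset/woman/*.jpg, dataset/transman/*.jpg (assumed layout).
GROUPS = ["woman", "man", "transwoman", "transman",
          "agender", "agenderqueer", "nonbinary"]

# The binary label a group's members would need to receive to count as
# correct; non-binary identities have no correct answer in a binary scheme.
EXPECTED = {"woman": "female", "man": "male",
            "transwoman": "female", "transman": "male"}

def score(dataset_dir: str) -> dict:
    correct = defaultdict(int)
    total = defaultdict(int)
    for group in GROUPS:
        group_dir = os.path.join(dataset_dir, group)
        for name in os.listdir(group_dir):
            prediction = classify_gender(os.path.join(group_dir, name))
            total[group] += 1
            if EXPECTED.get(group) == prediction:
                correct[group] += 1
    # Per-group accuracy; groups without a valid binary label stay at 0%.
    return {g: correct[g] / total[g] for g in GROUPS if total[g]}
```

Under a scheme like this, the agender, genderqueer and nonbinary groups can never be scored as correct, which is exactly the 100% misclassification rate the study reports.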

On average, the companies' software performed well at identifying cisgender women and cisgender men, getting it right 98.3% and 97.6% of the time, respectively. It fared worse with trans men, however, identifying them as women up to 38% of the time.

The software also performed poorly at categorizing people who identified as agender, genderqueer or nonbinary, meaning they identify as neither male nor female: it was wrong 100% of the time.

"These systems don't know any other language but male or female, so for many gender identities it is not possible for them to be correct," senior author Jed Brubaker, an assistant professor of Information Science.

The bias stems from how the software was trained, according to the team. But the researchers could not get access to the training data, the image inputs used to "teach" the system what male and female look like.

"When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the 90s and it is not what the world is like anymore. "As our vision and our cultural understanding of what gender is has evolved. The algorithms driving our technological future have not. That's deeply problematic," says Brubaker.
