Biometrics is a growing field impacting virtually every market, and, according to new research from the University of Colorado Boulder, these systems have trouble accurately identifying transgender people.
Biometrics includes technology that relies on the human body as a form of identification, such as facial recognition, fingerprint recognition, iris recognition and even full-body recognition. The research found that facial analysis systems misidentify the gender of transgender people more than one-third of the time.
"We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders," said Morgan Klaus Scheuerman, a Ph.D. student in the Information Science department at the University of Colorado Boulder. "While there are many different types of people out there, these systems have an extremely limited view of what gender looks like."
A growing trend in security services and financial establishments is to use hidden cameras to protect assets and identify people coming and going. Increasingly, these cameras are also being embedded in everything from smartphone dating apps and digital kiosks at malls to airport security and law enforcement surveillance systems.
While previous research has found facial recognition biometrics to be accurate when assessing the gender of white men, those same systems have misidentified women of color as much as one-third of the time.
"We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender," said Jed Brubaker, an assistant professor of Information Science at the University of Colorado Boulder. "We set out to test this in the real world."
The team collected 2,450 images of faces from Instagram, each tagged with a hashtag indicating the subject's gender identity. They divided the photos into seven groups and ran them through four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.
The systems were most accurate with photos of cisgender women (those assigned female at birth who identify as female), getting the gender right 98.3% of the time. The systems also categorized cisgender men accurately 97.6% of the time. However, trans men were wrongly identified as women up to 38% of the time, and those who identified as agender, genderqueer or nonbinary were mischaracterized 100% of the time.
The problem, researchers said, is that these systems recognize only two labels, male or female, so they cannot possibly classify other gender identities correctly. These systems also infer gender from outdated stereotypes. For example, Scheuerman, who is male and has long hair, submitted his own picture, and half of the systems categorized him as female.
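The binary-label limitation described above can be illustrated with a minimal sketch. This is not the study's actual code; the records and labels below are hypothetical, but they show why per-group accuracy for any nonbinary identity is zero by construction when a classifier can only output "male" or "female":

```python
from collections import defaultdict

# Hypothetical (self-identified group, actual gender label, label returned
# by a binary gender classifier) records -- illustrative only.
records = [
    ("cis woman", "female", "female"),
    ("cis man",   "male",   "male"),
    ("trans man", "male",   "female"),   # misclassified, as in the study's findings
    ("nonbinary", "nonbinary", "female") # a binary system can never output "nonbinary"
]

totals = defaultdict(int)
correct = defaultdict(int)
for group, actual, predicted in records:
    totals[group] += 1
    correct[group] += (actual == predicted)

# Per-group accuracy: nonbinary identities score 0 no matter how the
# classifier behaves, because its label set excludes them entirely.
accuracy = {g: correct[g] / totals[g] for g in totals}
```

With these toy records, `accuracy` is 1.0 for the cisgender groups and 0.0 for both the trans man and nonbinary examples, mirroring the structural problem the researchers describe.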
The market for facial recognition biometrics is expected to double by 2024 as companies seek to improve human-robot interaction and more carefully target ads to shoppers, researchers said. To improve these services for people of all genders, the researchers suggest that companies move away from gender classification entirely and instead use more specific, observable labels such as "long hair" or "make-up" when assessing images.