With a brief glance at a single face, emerging facial recognition software can now categorize the gender of many men and women with remarkable accuracy. But if that face belongs to a transgender person, such systems get it wrong more than one third of the time, according to new CU Boulder research.
“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” said lead author Morgan Klaus Scheuerman, a PhD student in the Information Science department. “While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”
The findings come at a time when facial analysis technologies, which use hidden cameras to assess and characterize certain features of an individual, are becoming increasingly prevalent, embedded in everything from smartphone dating apps and digital kiosks at malls to airport security and law enforcement surveillance systems.
Previous research suggests they tend to be most accurate when assessing the gender of white men, but misidentify women of color as much as one-third of the time.
“We knew there were inherent biases in these systems around race and ethnicity, and we suspected there would also be problems around gender,” said senior author Jed Brubaker, an assistant professor of Information Science. “We set out to test this in the real world.”
For some gender identities, accuracy is impossible
Researchers collected 2,450 images of faces from Instagram, each of which had been labeled by its owner with a hashtag indicating their gender identity. The pictures were then divided into seven groups of 350 images (#women, #man, #transwoman, #transman, #agender, #agenderqueer, #nonbinary) and analyzed by four of the largest providers of facial analysis services (IBM, Amazon, Microsoft and Clarifai).
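The study's core measurement is simple: for each hashtag group, what fraction of images did a service label correctly? A minimal sketch of that per-group accuracy calculation is below, using hypothetical records rather than the study's actual data or any vendor's real API; the hashtag-to-label mapping and the sample predictions are illustrative assumptions. It also makes the paper's structural point explicit in code: for identities outside the male/female binary, no binary prediction can ever count as correct.

```python
from collections import Counter

# Hypothetical mapping from a self-identified hashtag to the only binary
# label that could count as "correct" -- None means no binary label can
# ever be right, so accuracy for that group is structurally capped at 0.
EXPECTED = {
    "#woman": "female", "#man": "male",
    "#transwoman": "female", "#transman": "male",
    "#agender": None, "#genderqueer": None, "#nonbinary": None,
}

def per_group_accuracy(records):
    """records: iterable of (hashtag, predicted_label) pairs."""
    hits, totals = Counter(), Counter()
    for tag, predicted in records:
        totals[tag] += 1
        if EXPECTED.get(tag) is not None and predicted == EXPECTED[tag]:
            hits[tag] += 1
    # Fraction correct per group; non-binary groups can never score above 0
    return {tag: hits[tag] / totals[tag] for tag in totals}

# Illustrative sample, not real study data
sample = [
    ("#woman", "female"), ("#woman", "female"),
    ("#transman", "female"),   # misgendered: expected "male"
    ("#nonbinary", "male"),    # no binary answer can be correct
]
print(per_group_accuracy(sample))
# {'#woman': 1.0, '#transman': 0.0, '#nonbinary': 0.0}
```

The same tally, run over each provider's 2,450 predictions, yields the per-group accuracy figures the researchers report.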
Notably, Google was not included because it does not offer gender recognition services.
On average, the systems were most accurate with photos of cisgender women (those born female and identifying as female), getting their gender right 98.3% of the time. They categorized cisgender men accurately 97.6% of the time.
But trans men were wrongly identified as women up to 38% of the time. And those who identified as agender, genderqueer or nonbinary, indicating that they identify as neither male nor female, were mischaracterized 100% of the time.
“These systems don’t know any other language but male or female, so for many gender identities it is not possible for them to be correct,” said Brubaker.
Outdated stereotypes persist
The study also suggests that such services identify gender based on outdated stereotypes.
When Scheuerman, who is male and has long hair, submitted his own picture, half of the services categorized him as female.
The researchers could not access the training data, or image inputs, used to “teach” the systems what male and female look like, but previous research suggests they assess features like eye position, lip fullness, hair length and even clothing.
“These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognized as a man or a woman. And that impacts everyone,” said Scheuerman.
The market for facial recognition services is projected to double by 2024, as tech developers work to improve human-robot interaction and more carefully target ads to shoppers.
“They want to figure out what your gender is so they can sell you something more appropriate for your gender,” explained Scheuerman, pointing to one highly publicized incident in which a Canadian mall used a hidden camera in a kiosk to do just that.
Already, Brubaker noted, people engage with facial recognition technology every day to gain access to their smartphones or log into their computers. If it has a tendency to misgender certain populations that are already vulnerable, that could have grave consequences.
For instance, a match-making app could set someone up on a date with the wrong gender, leading to a potentially dangerous situation. Or a mismatch between the gender a facial recognition program sees and the documentation a person carries could lead to problems getting through airport security, said Scheuerman. He is most concerned that such systems reaffirm notions that transgender people don’t fit in.
“People think of computer vision as futuristic, but there are lots of people who could be left out of this so-called future,” he said.
The authors say they’d like to see tech companies move away from gender classification entirely and stick to more specific labels like “long hair” or “make-up” when assessing images.
“When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the ’90s and it is not what the world is like anymore,” said Brubaker. “As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That’s deeply problematic.”
The research will be presented in November at the ACM Conference on Computer Supported Cooperative Work in Austin, Texas.