Facial recognition software unable to recognise trans people, university study suggests

Facial recognition software is unable to recognise trans people, a university study has suggested.

The University of Colorado Boulder in the US set out to investigate the accuracy of facial analysis technology with transgender people and those who classify themselves as gender non-binary.

Researchers collected almost 2,500 images of faces from Instagram which carried a hashtag indicating the subject's gender identity, including woman, man, transwoman, transman, agender, agenderqueer and nonbinary.

The images were then analysed by four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.

Researchers found that on average the systems were most accurate with photos of cisgender women, getting their gender right 98.3 per cent of the time. Cisgender men were categorised accurately 97.6 per cent of the time.

However, while the software is often accurate with cisgender people, the researchers found that it struggled with transgender people. Trans men were wrongly identified as women up to 38 per cent of the time. And those who identified as agender, genderqueer or nonbinary – people who do not identify as either male or female – were mischaracterised 100 per cent of the time.

Responding to the results, Dr Jane Hamlin, president of the Beaumont Society, a transgender support group, said: “It is unfortunate that these service providers are still stuck in the past with the out-dated notion of just two discrete genders.

“Fortunately, the rest of us are moving on and recognise that there is a range of gender identities that better match the real situation.

“Clearly more work needs to be done in ensuring that this software – if it is to be used, and that is contentious in some quarters – is much more accurate and reflects the rich diversity of people all around us.
Misgendering people, for whatever reason, is thoughtless and hurtful.”

The study comes as the use of facial recognition technology – using hidden cameras to assess certain features of an individual – becomes increasingly prevalent, embedded in everything from smartphone dating apps to law enforcement. Previous research suggests that the technology is most accurate when assessing the gender of white men, but misidentifies women of colour as much as one-third of the time.

Study lead author Morgan Klaus Scheuerman, a PhD student at the University of Colorado Boulder, said: "We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders.

"While there are many different types of people out there, these systems have an extremely limited view of what gender looks like."

Dr Jed Brubaker, an Assistant Professor of Information Science at UC Boulder, added: "We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender. We set out to test this in the real world.

"These systems don't know any other language but male or female, so for many gender identities it is not possible for them to be correct."

The research also suggests that such services identify gender based on outdated stereotypes. When Mr Scheuerman, who is male and has long hair, submitted his own picture, half the services categorised him as female.

He added: "These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognised as a man or a woman. And that impacts everyone."
The researchers could not get access to the training data or image inputs used to "teach" the systems what male and female look like, but previous research suggests they assess features such as eye position, lip fullness, hair length and even clothing.

The market for facial recognition services is projected to double by 2024 and, already, many people engage with the technology every day to gain access to their smartphones or log into their computers. That has bred concerns that there could be grave consequences if certain vulnerable populations are consistently misgendered. A mismatch between the gender a facial recognition camera sees and the documentation a person carries could, for example, lead to problems getting through airport security.

Mr Scheuerman is most concerned that the technology will reaffirm notions that transgender people don't fit in. He said: "People think of computer vision as futuristic, but there are lots of people who could be left out of this so-called future."

The authors would like tech companies to move away from gender classification entirely and stick to more specific labels such as "long hair" or "make-up" when assessing images.

Dr Brubaker added: "When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the 90s and it is not what the world is like anymore.

"As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That's deeply problematic."

The research will be presented in November at the ACM Conference on Computer Supported Cooperative Work in Austin, Texas.


