Research to improve LGBTQ inclusivity of facial biometrics-based gender classification criticized

A research paper that attempts to model gender as a continuum in order to reduce bias and increase inclusivity in facial recognition applications for gender classification has been published on Arxiv.org, but it is drawing heavy criticism from another researcher in the field, VentureBeat reports.

‘Gender Classification and Bias Mitigation in Facial Images’ was written by four researchers affiliated with Harvard and Autodesk, and suggests that more inclusive databases are necessary to mitigate bias against LGBTQ and gender non-binary people.

University of Washington AI researcher Os Keyes, however, tells VB that the paper’s authors “go back and forth between treating gender as physiologically and visually modeled in a fixed way and being more flexible and contextual,” and expresses skepticism that anyone from the groups the paper focuses on was consulted.

Several groups and jurisdictions have focused on demographic performance differences and allegations of bias in facial recognition while considering or calling for bans on some or all uses of the technology. VentureBeat claims the Association for Computing Machinery (ACM) has called for a moratorium on all uses of facial recognition, although the group’s actual statement specifies applications that could “be prejudicial to established human and legal rights.”

The researchers say that a lack of LGBTQ representation in benchmark databases could mask poor performance of machine learning tools for gender classification. They attempt to improve the performance of gender classification systems by building two databases: an “inclusive” one in which 9 percent of subjects identify as LGBTQ, and a second made up exclusively of people who self-identify as non-binary.

That self-identification is a problematic qualification, according to Keyes, as is the paper’s reference to a study suggesting that all gender transformation procedures cause significant changes to people’s faces. That contentious study has been described as “junk science” by GLAAD and the Human Rights Campaign.

The researchers say their system classifies gender non-binary individuals with 91.97 percent accuracy. Keyes counters that a non-consensual definition of gender cannot, by its nature, be “trans-inclusive.”
