Research to improve LGBTQ inclusivity of facial biometrics-based gender classification criticized

A research paper attempting to model gender as a continuum, in order to reduce bias and increase inclusivity in facial recognition applications for gender classification, has been published on arXiv.org, but is meeting heavy criticism from another researcher in the field, VentureBeat reports.

‘Gender Classification and Bias Mitigation in Facial Images’ was written by four researchers affiliated with Harvard and Autodesk, and argues that more inclusive databases are necessary to mitigate bias against LGBTQ and gender non-binary people.

University of Washington AI researcher Os Keyes, however, tells VB that the paper’s authors “go back and forth between treating gender as physiologically and visually modeled in a fixed way and being more flexible and contextual,” and expresses skepticism that anyone from the groups the paper focusses on was consulted.

Several groups and jurisdictions have focussed on demographic performance differences and allegations of bias in facial recognition while considering or calling for some or all uses of it to be banned. VentureBeat claims the Association for Computing Machinery (ACM) has called for a moratorium on all uses of facial recognition, although the actual statement from the group specifies applications that could “be prejudicial to established human and legal rights.”

The researchers say that a lack of LGBTQ representation in benchmark databases could mask poor performance of machine learning tools for gender classification. They attempt to improve the performance of gender classification systems by building two datasets: an “inclusive” database in which 9 percent of subjects identify as LGBTQ, and a second database made up exclusively of people who self-identify as non-binary.

That self-identification is a problematic qualification, according to Keyes, as is reference to a study that suggests all gender transformation procedures cause significant changes to people’s faces. The contentious study has been described as “junk science” by GLAAD and the Human Rights Campaign.

The researchers say their system results in 91.97 percent accuracy in classifying gender non-binary individuals. Keyes says non-consensual definition of gender cannot be “trans-inclusive” by nature.
