Researcher says facial recognition models perpetuate discrimination against trans people
Facial recognition research papers treat gender as binary more than 90 percent of the time and as immutable more than 70 percent of the time, and papers focused specifically on gender treat it as a purely physiological feature more than 80 percent of the time, according to research reported by Motherboard.
University of Washington researcher Os Keyes studied 58 research papers to produce “The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition,” which explores the ubiquity of automatic gender recognition (AGR) and how models built on these assumptions perpetuate and increase discrimination against transgender people.
“I couldn’t help but be personally, as well as professionally annoyed by the approach that the field took to gender—of assuming these two very monolithic and universal categories of gendered experience,” Keyes told Motherboard. “Pretty much every paper I read did it.”
Keyes says that only three of the 58 papers focused on trans people, and none on non-binary trans people. The way forward, according to Keyes, is both to bring fields such as ethics and gender studies into the education of computer science students and to deploy technologies such as AGR only where there is a genuine need.
“Technologies need to be contextual and need-driven,” Keyes says. “What are the values of the people who use the space that you’re deploying a technology in? Do the people in that space actually need it? If we’re not discussing gender at all, or race at all…it doesn’t necessarily lead to a better world.”
Even when people identify their gender in traditional, binary ways, facial recognition technologies do not always classify it accurately, with error rates highest for darker-skinned women, as shown in research recently presented by Joy Buolamwini at the World Economic Forum in Davos.