
American researchers probe where biometric bias comes in and how to measure it

Researchers from the Identity and Data Sciences Lab at the Maryland Test Facility (MdTF) have published a pair of papers examining why biometric systems are so often found to be less effective for some demographic groups, and how to measure those disparities.

John Howard of the MdTF, which hosts DHS biometric technology evaluations, highlighted the papers in a LinkedIn post.

‘Disparate impact in facial recognition stems from the broad homogeneity effect: A case study and method to resolve’ attributes the problem of biometric bias to “demographic clustering.” This is the phenomenon where features determined (at least in part) by the gender or ethnicity of people inflate similarity scores between different individuals from the same demographic group.

The paper shows that it is possible to remove feature patterns shared within demographic groups while keeping the distinct features needed for facial recognition. The team used linear dimensionality-reduction techniques to increase the “fairness” of two ArcFace algorithms, as measured in four different ways, without lowering true match rates.
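The paper's own method uses linear dimensionality techniques on the embedding space; as a minimal sketch of the general idea (not the authors' exact approach), group-shared structure can be removed with a simple linear operation that centers each demographic group's embeddings on the global mean, so within-group offsets vanish while individual variation survives. The function name and inputs below are hypothetical.

```python
import numpy as np

def remove_group_offset(embeddings, groups):
    """Shift each demographic group's embeddings so its mean coincides with
    the global mean, removing the group-shared component (a linear map)."""
    embeddings = np.asarray(embeddings, dtype=float)
    groups = np.asarray(groups)
    global_mean = embeddings.mean(axis=0)
    out = embeddings.copy()
    for g in np.unique(groups):
        mask = groups == g
        # Subtract this group's mean offset relative to the global mean
        out[mask] += global_mean - embeddings[mask].mean(axis=0)
    return out
```

After this transform, a nearest-neighbor match can no longer exploit the fact that two embeddings merely share a demographic cluster center.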

‘Evaluating proposed fairness models for face recognition algorithms’ considers the Fairness Discrepancy Rate (FDR) proposed by Idiap researchers and the Inequity Rate (IR) proposed by NIST researchers. Both metrics are found to be difficult to interpret due to inherent mathematical characteristics. The study authors therefore propose the Functional Fairness Measure Criteria (FFMC) to help with interpretations of the above metrics.
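As a rough sketch of why such metrics can be hard to interpret, FDR is commonly described as one minus a weighted sum of the worst-case gaps in false match rate (FMR) and false non-match rate (FNMR) across demographic groups at a fixed threshold. The weighting parameter and dictionary inputs below are assumptions for illustration, not the papers' reference implementation.

```python
def fdr(fmr_by_group, fnmr_by_group, alpha=0.5):
    """Fairness Discrepancy Rate at one threshold: 1 minus a weighted sum of
    the largest FMR gap and the largest FNMR gap across demographic groups."""
    fmr_gap = max(fmr_by_group.values()) - min(fmr_by_group.values())
    fnmr_gap = max(fnmr_by_group.values()) - min(fnmr_by_group.values())
    return 1.0 - (alpha * fmr_gap + (1.0 - alpha) * fnmr_gap)
```

Because error rates are typically tiny fractions, the gaps are tiny too, so FDR values crowd near 1.0 for very different systems — one intuition for the interpretability problem the study raises.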

They also develop a new measure, the Gini Aggregation Rate for Biometric Equitability (GARBE). The technique is based on the Gini coefficient, a statistical measure of dispersion most often used to quantify income inequality.
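To make the underlying statistic concrete, here is a minimal sketch of the Gini coefficient applied to per-group error rates: 0 means the rates are identical across groups, values near 1 mean the errors are concentrated in one group. The function name and inputs are illustrative; GARBE's own aggregation on top of this is defined in the paper.

```python
def gini(values):
    """Gini coefficient of non-negative values (e.g. per-group error rates),
    via the sorted-index formula: G = sum_i (2i - n - 1) * x_i / (n * sum(x))."""
    vals = sorted(values)
    n = len(vals)
    total = sum(vals)
    if total == 0:
        return 0.0  # all-zero rates: no dispersion by convention
    return sum((2 * (i + 1) - n - 1) * v for i, v in enumerate(vals)) / (n * total)
```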

The work on an evaluation method is intended to directly support ISO/IEC 19795-10, the international standard for measuring variation in biometric system performance across demographic groups.

Both papers appeared in the proceedings of the 26th International Conference on Pattern Recognition (ICPR 2022) Fairness in Biometrics Workshop.
