
New paper explores facial recognition biases beyond demographics

The well-reported biases surrounding skin tone and sex are not the only biases held by facial recognition systems, finds a new paper, which calls for considerably more development before face biometric systems can be considered fair.

‘A Comprehensive Study on Face Recognition Biases Beyond Demographics’, from a German-Spanish team, tested the FaceNet and ArcFace facial recognition models with the MAAD-Face dataset, which contains more than 120 million attribute annotations for 3.3 million face images, to see whether the models return biases beyond explicit demographics such as age, sex and skin tone.

The team also tested attributes beyond explicit demographics, such as accessories, hairstyles and hair colors, face shapes, facial anomalies and make-up.

The early-access version of the full paper, which could be subject to further editing before final publication, includes the results for each attribute tested, with revealing graphs of the level of bias for both FaceNet and ArcFace that distinguish the attributes causing degraded biometric recognition from those that enhance recognition performance.
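The paper's exact methodology is set out in the preprint, but the basic idea of probing a model for attribute-level bias can be sketched briefly. The snippet below is a simplified illustration, not the authors' code: it compares the false non-match rate of genuine (same-identity) pairs that carry a given attribute annotation against pairs that do not, using pre-computed, L2-normalised embeddings from a model such as FaceNet or ArcFace. The function names, the fixed similarity threshold and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def fnmr_at_threshold(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float) -> float:
    """False non-match rate for genuine (same-identity) pairs, given
    L2-normalised embeddings and a cosine-similarity decision threshold."""
    sims = np.sum(emb_a * emb_b, axis=1)      # cosine similarity per pair
    return float(np.mean(sims < threshold))   # fraction wrongly rejected

def attribute_bias(emb_a: np.ndarray, emb_b: np.ndarray,
                   has_attribute: np.ndarray, threshold: float = 0.4) -> float:
    """Difference in genuine-pair error rate between pairs annotated with an
    attribute (e.g. 'rosy cheeks') and pairs without it. A positive value
    suggests the attribute degrades recognition at this operating point."""
    with_attr = fnmr_at_threshold(emb_a[has_attribute], emb_b[has_attribute], threshold)
    without_attr = fnmr_at_threshold(emb_a[~has_attribute], emb_b[~has_attribute], threshold)
    return with_attr - without_attr

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 1000, 128
    # Synthetic stand-ins for embeddings of genuine image pairs (rows L2-normalised);
    # in practice these would come from the face recognition model under test.
    emb_a = rng.normal(size=(n, d)); emb_a /= np.linalg.norm(emb_a, axis=1, keepdims=True)
    emb_b = rng.normal(size=(n, d)); emb_b /= np.linalg.norm(emb_b, axis=1, keepdims=True)
    has_attr = rng.random(n) < 0.3   # hypothetical per-pair attribute annotation
    print(f"bias gap: {attribute_bias(emb_a, emb_b, has_attr):+.3f}")
```

Repeating such a comparison over every annotated attribute, and sorting by the size of the gap, yields the kind of ranking of degrading versus enhancing attributes reported in the paper's charts.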

Having a moustache, a goatee, a round face, an obstructed forehead or rosy cheeks, or wearing lipstick or glasses can all lead to degraded recognition. Gray hair improves it, as does having a beard or even just a 5 o’clock shadow, compared to no beard.

The authors were able to explain some of the reasons behind the results, but not all. “The findings of this work strongly demonstrate the need for further advances in making face recognition systems more robust, explainable, and fair. We hope these findings lead to the development of more robust and unbiased face recognition solutions,” concludes the paper.

AnyVision recently called on companies developing biometrics and AI algorithms to remove demographic bias, in response to the U.S. National Institute of Standards and Technology’s (NIST’s) call for public comment on its proposed method for evaluating user trust in AI systems. OpenAI has admitted demographic biases in its new computer vision model.
