
New paper explores facial recognition biases beyond demographics

The well-reported biases around skin tone and sex are not the only biases found in facial recognition systems, according to a new paper, which calls for considerably more development before face biometric systems can be considered fair.

‘A Comprehensive Study on Face Recognition Biases Beyond Demographics’, from a German-Spanish team, tested the FaceNet and ArcFace facial recognition models against the MAAD-Face dataset, which contains more than 120 million attribute annotations for 3.3 million face images, to see whether the models exhibit biases beyond explicit demographics such as age, sex and skin tone.

The team also tested non-demographic attributes such as accessories, hairstyles and hair colors, face shapes, facial anomalies and make-up.

The early release of the full paper, which may be edited further before final publication, includes results for each attribute tested, with graphs charting the level of bias for both FaceNet and ArcFace and showing which attributes degrade biometric recognition performance and which enhance it.

Having a moustache, goatee, round face, obstructed forehead or rosy cheeks, or wearing lipstick or glasses, can all degrade recognition. Gray hair improves it, as does having a beard or even just a 5 o’clock shadow, compared with no beard.
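To make concrete what a per-attribute bias measurement of this kind can involve, the sketch below is a minimal illustration, not the authors' code: it assumes precomputed, L2-normalised embeddings from a model such as FaceNet or ArcFace, identity labels, and a binary flag for a MAAD-Face-style attribute (for example, "rosy cheeks"). The variable names (`embeddings`, `labels`, `has_attribute`) and the fixed similarity threshold are hypothetical choices for illustration. It compares verification accuracy on images carrying the attribute against the overall baseline, so a negative gap suggests the attribute degrades recognition.

```python
# Minimal sketch of per-attribute bias estimation for a face recognition model.
# Not the paper's methodology; a simplified illustration using synthetic data.
import numpy as np


def verification_accuracy(embeddings, labels, threshold=0.5):
    """Fraction of image pairs correctly classified as same/different identity
    by thresholding the cosine similarity of their (unit-length) embeddings."""
    sims = embeddings @ embeddings.T                  # cosine similarity matrix
    same_id = labels[:, None] == labels[None, :]      # ground-truth pair labels
    iu = np.triu_indices(len(labels), k=1)            # each unique pair once
    predictions = sims[iu] > threshold                # predicted "same identity"
    return float((predictions == same_id[iu]).mean())


def attribute_bias(embeddings, labels, has_attribute, threshold=0.5):
    """Accuracy gap between images carrying the attribute and the full set.
    Negative values suggest the attribute degrades recognition performance."""
    overall = verification_accuracy(embeddings, labels, threshold)
    mask = has_attribute.astype(bool)
    with_attr = verification_accuracy(embeddings[mask], labels[mask], threshold)
    return with_attr - overall


if __name__ == "__main__":
    # Synthetic stand-ins: 200 random unit embeddings, 50 identities,
    # and a hypothetical attribute present on roughly 30% of images.
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(200, 128))
    emb /= np.linalg.norm(emb, axis=1, keepdims=True)
    ids = rng.integers(0, 50, size=200)
    attr = rng.random(200) < 0.3
    print(f"bias estimate: {attribute_bias(emb, ids, attr):+.4f}")
```

Run on real annotated data, the same comparison would be repeated for every attribute and for each model being audited, which is the kind of per-attribute breakdown the paper's graphs summarize.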

The authors were able to explain some of the reasons behind the results, but not all. “The findings of this work strongly demonstrate the need for further advances in making face recognition systems more robust, explainable, and fair. We hope these findings lead to the development of more robust and unbiased face recognition solutions,” concludes the paper.

AnyVision recently called on companies developing biometrics and AI algorithms to remove demographic bias, in response to the U.S. National Institute of Standards and Technology’s (NIST’s) call for public comment on its proposed method for evaluating user trust in AI systems. OpenAI has admitted demographic biases in its new computer vision model.


