
Facial recognition systems based on older methods show high levels of bias: Study

A new study has revealed troubling differences in how facial recognition systems relying on widely used, older methods in open-source packages detect the faces of people with darker skin compared to those with lighter skin.

Using generalized linear modeling, researchers demonstrated that the technology failed to detect faces in just 0.28 percent of cases for people with the lightest skin, but failed a staggering 24.34 percent of the time for those with the darkest skin.
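As a rough illustration of what generalized linear modeling captures here: with a single binary skin-tone predictor, a binomial (logistic) GLM's slope coefficient is simply the difference in log-odds of failure between the two groups. The sketch below plugs in the two rates reported above; the study's actual model was fit on per-subject data, which is not reproduced here.

```python
import math

# Failure-to-detect rates reported in the study (illustrative use only).
p_light = 0.0028   # 0.28 percent, lightest skin tone
p_dark = 0.2434    # 24.34 percent, darkest skin tone

def log_odds(p):
    """Logit link function used by a binomial GLM."""
    return math.log(p / (1.0 - p))

# With one binary predictor, the fitted GLM slope equals the
# difference in log-odds of failure between the two groups.
slope = log_odds(p_dark) - log_odds(p_light)
odds_ratio = math.exp(slope)  # odds of failure are over 100x higher
```

The odds ratio implied by these two rates is far larger than the raw rate ratio suggests, which is one reason GLMs on the logit scale are the standard tool for quantifying this kind of differential.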

Researchers associated with the U.S. government traced these failures back to outdated face detection methods commonly used in open-source software packages. These older algorithms appear to have been trained primarily on lighter-skinned individuals, creating blind spots when encountering people with darker complexions.

“We trace these issues to widely used, older methods in open-source packages for face detection,” says the paper. “Furthermore, this demographic differential is not observed when testing open-source packages using a different, more curated dataset.”

The study, titled “Performance Differentials in Deployed Biometric Systems Caused by Open-Source Face Detectors,” examined three face detectors.

The first detector (OpenCV) uses the pre-trained Haar cascade classifier implemented in OpenCV. The second detector (Dlib) is Dlib’s face detector based on the Histogram of Oriented Gradients feature descriptor combined with a linear Support Vector Machine. The third detector (DNN) employs OpenCV’s Deep Neural Network module, utilizing a pre-trained Single Shot Multibox Detector model with a ResNet-10 backbone.

The study used real-world testing scenarios with participants from diverse backgrounds to simulate how these systems would actually perform when deployed in the field. This approach revealed problems that might have gone unnoticed with traditional testing methods.

Interestingly, skin tone itself proved to be a better predictor of system failure than self-reported race, suggesting the bias stems from how the technology processes visual information rather than broader demographic categories.

The research highlights a critical gap in how AI systems are evaluated before being released to the public. When researchers tested the same technology using a more carefully selected dataset, the racial disparities disappeared entirely. One of the study's conclusions is that outdated, poorly performing algorithms should be removed from widely used software libraries and replaced with more accurate alternatives.

John Howard, principal scientist at the Identity and Data Sciences Lab at the Maryland Test Facility, who participated in the study, warned that the results have implications for OpenCV, one of the most widely taught computer vision libraries in computer science curricula.

“Please don’t use OpenCV in publicly facing applications that deal with faces,” Howard says in a LinkedIn post. “It shows almost 10x levels of skintone bias, failing to detect faces with darker skin-tones at significantly higher rates.”

The scientist is one of the co-editors of ISO/IEC 19795-10, the international standard for testing demographic differentials in biometric systems. Another co-editor is Yevgeniy Sirotin, who also collaborated on the latest study.

Other authors include Cynthia M. Cook, Jerry L. Tipton and Laurie Cuffney from the Identity and Data Sciences Lab and Arun R. Vemury from the U.S. Department of Homeland Security.
