
Facial recognition systems based on older methods show high levels of bias: Study

A new study has revealed troubling differences in how facial recognition systems relying on widely used, older methods in open-source packages detect the faces of people with darker skin compared to those with lighter skin.

Using generalized linear modeling, researchers demonstrated that the technology failed to detect faces in just 0.28 percent of cases for people with the lightest skin, but failed a staggering 24.34 percent of the time for those with the darkest skin.
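To put those two reported rates in perspective, here is a back-of-the-envelope comparison in Python. The ratio and odds ratio computed below are illustrative arithmetic derived from the quoted percentages, not figures taken from the paper itself (the odds ratio is the quantity a logistic-regression GLM of this kind typically models):

```python
# Failure rates reported in the study: 0.28% for the lightest skin tones,
# 24.34% for the darkest. The derived numbers below are illustrative only.
lightest_failure = 0.0028
darkest_failure = 0.2434

# Simple ratio of the two failure rates.
rate_ratio = darkest_failure / lightest_failure

# Odds ratio, the effect size a logistic-regression GLM would estimate.
odds_ratio = (darkest_failure / (1 - darkest_failure)) / (
    lightest_failure / (1 - lightest_failure))

print(f"failure-rate ratio: {rate_ratio:.1f}x")  # roughly 87x
print(f"odds ratio: {odds_ratio:.1f}")
```

Either way it is sliced, the differential between the two ends of the skin-tone scale is nearly two orders of magnitude.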

Researchers associated with the U.S. government traced these failures back to outdated face detection methods commonly used in open-source software packages. These older algorithms appear to have been trained primarily on lighter-skinned individuals, creating blind spots when encountering people with darker complexions.

“We trace these issues to widely used, older methods in open-source packages for face detection,” says the paper. “Furthermore, this demographic differential is not observed when testing open-source packages using a different, more curated dataset.”

The study, titled “Performance Differentials in Deployed Biometric Systems Caused by Open-Source Face Detectors,” examined three face detectors.

The first detector (OpenCV) uses the pre-trained Haar cascade classifier implemented in OpenCV. The second detector (Dlib) is Dlib’s face detector based on the Histogram of Oriented Gradients feature descriptor combined with a linear Support Vector Machine. The third detector (DNN) employs OpenCV’s Deep Neural Network module, utilizing a pre-trained Single Shot Multibox Detector model with a ResNet-10 backbone.
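For readers unfamiliar with these packages, the sketch below shows how the three detectors are commonly invoked. This is an illustration, not code from the study: the helper names are made up, the DNN model file names are the commonly distributed ones and must be downloaded separately, and the `opencv-python` and `dlib` packages must be installed before the functions can actually run.

```python
def detect_haar(gray_image):
    """Haar cascade classifier bundled with OpenCV (the study's 'OpenCV' detector)."""
    import cv2  # requires the opencv-python package
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Returns an array of (x, y, w, h) boxes; an empty result is a missed face.
    return cascade.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)

def detect_hog_svm(gray_image):
    """HOG feature descriptor + linear SVM (the study's 'Dlib' detector)."""
    import dlib  # requires the dlib package
    detector = dlib.get_frontal_face_detector()
    return detector(gray_image, 1)  # 1 = upsample once to find smaller faces

def detect_dnn(bgr_image,
               prototxt="deploy.prototxt",
               weights="res10_300x300_ssd_iter_140000.caffemodel",
               threshold=0.5):
    """SSD with a ResNet-10 backbone via OpenCV's DNN module (the study's 'DNN'
    detector). The model files named above are assumptions: they are the widely
    shared pre-trained files and are not shipped with OpenCV itself."""
    import cv2
    net = cv2.dnn.readNetFromCaffe(prototxt, weights)
    blob = cv2.dnn.blobFromImage(cv2.resize(bgr_image, (300, 300)),
                                 1.0, (300, 300), (104.0, 177.0, 123.0))
    net.setInput(blob)
    detections = net.forward()
    # Keep only detections whose confidence clears the threshold.
    return [detections[0, 0, i] for i in range(detections.shape[2])
            if detections[0, 0, i, 2] > threshold]
```

In each case, an empty result for an image that does contain a face is exactly the kind of detection failure the study measured.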

The study used real-world testing scenarios with participants from diverse backgrounds to simulate how these systems would actually perform when deployed in the field. This approach revealed problems that might have gone unnoticed with traditional testing methods.

Interestingly, skin tone itself proved to be a better predictor of system failure than self-reported race, suggesting the bias stems from how the technology processes visual information rather than broader demographic categories.

The research highlights a critical gap in how AI systems are evaluated before being released to the public. When researchers tested the same technology using a more carefully selected dataset, the racial disparities disappeared entirely. One of the conclusions is that outdated, poorly performing algorithms should be removed from widely used software libraries and replaced with more accurate alternatives.

John Howard, principal scientist at the Identity and Data Sciences Lab at the Maryland Test Facility, who participated in the study, warned that the results have implications for OpenCV, one of the most widely taught computer vision tools in computer science curricula.

“Please don’t use OpenCV in publicly facing applications that deal with faces,” Howard says in a LinkedIn post. “It shows almost 10x levels of skintone bias, failing to detect faces with darker skin-tones at significantly higher rates.”

The scientist is one of the co-editors of ISO/IEC 19795-10, the international standard for testing biometric systems for demographic differentials, or bias. Another co-editor is Yevgeniy Sirotin, who also collaborated on the latest study.

Other authors include Cynthia M. Cook, Jerry L. Tipton and Laurie Cuffney from the Identity and Data Sciences Lab and Arun R. Vemury from the U.S. Department of Homeland Security.
