Trueface evaluates bias factor in its biometric facial recognition model

In a recent evaluation, Trueface investigated the bias factor in its biometric facial recognition technology by using a unique dataset called Fairface, labeled with ethnicity, gender and age groups, writes Cyrus Behroozi, computer vision software developer at Trueface, in a Medium post.

The sample included an almost equal number of images from each ethnicity and gender group to ensure all were adequately represented. Trueface compared the images in pairs and computed a similarity score for each image pair.

“Since all the images in the dataset belong to different identities, the comparisons are considered impostor matches, and the distribution of similarity scores should be centered about 0,” Behroozi explains. “We then compute the false positive rate at varying similarity-score-thresholds ranging from 0 to 1 for comparisons among each of the ethnicity and gender groups.”
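The procedure Behroozi describes can be sketched in a few lines: collect similarity scores for impostor pairs (different identities), then measure what fraction of those scores exceed each threshold between 0 and 1, per demographic group. The sketch below uses synthetic, normally distributed scores centered about 0 as stand-ins for real impostor comparisons; the group names and distribution parameters are illustrative assumptions, not values from Trueface's evaluation.

```python
import numpy as np

# Hypothetical impostor similarity scores for two demographic groups.
# In the real evaluation these would come from comparing face templates
# of different identities within each labeled group of the dataset.
rng = np.random.default_rng(0)
scores = {
    "group_a": rng.normal(loc=0.0, scale=0.10, size=10_000),
    "group_b": rng.normal(loc=0.0, scale=0.12, size=10_000),
}

# Thresholds ranging from 0 to 1, as described in the evaluation.
thresholds = np.linspace(0.0, 1.0, 101)

def false_positive_rate(impostor_scores, threshold):
    """Fraction of impostor pairs whose similarity meets the threshold,
    i.e. pairs that would be wrongly accepted as a match."""
    return float(np.mean(impostor_scores >= threshold))

fpr = {
    group: [false_positive_rate(s, t) for t in thresholds]
    for group, s in scores.items()
}

# A biased model would show markedly different FPR curves across groups
# at the same threshold; an unbiased one keeps the curves close together.
for group, rates in fpr.items():
    print(group, "FPR at threshold 0.5:", rates[50])
```

Plotting each group's FPR curve against the threshold axis, as the Medium post does, makes any gap between demographic groups visible at a glance.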

When the team tested the model across ethnicities at varying thresholds, performance was best for white subjects and worst for East Asian subjects, confirming some bias, though with "minimal performance loss."

According to Behroozi, in an unbiased facial recognition model the false positive rate should be minimal. The evaluation showed that Trueface's model performs better on females than males, but the difference and bias are negligible. An unbiased facial recognition model can be deployed in any geographical location without retraining or new data, and it will deliver consistent results regardless of ethnicity and gender. It can, for example, be used in offices in the U.S., Japan, Pakistan, Belarus and the United Kingdom, as well as in grocery chains in different neighborhoods or cities, the company says. Detailed information about the evaluation can be reviewed on Medium.

When bias is detected in facial recognition algorithms, it is a sign that the results might not be accurate for all demographics. According to NIST Facial Recognition Vendor Test results, a number of algorithms have displayed high false positive rates for females and people with dark skin.

Trueface’s software was recently ranked seventh among 199 algorithms in genuine template comparison time, according to the latest Facial Recognition Vendor Test (FRVT) results from the National Institute of Standards and Technology (NIST).
