
Trueface evaluates bias in its biometric facial recognition model

In a recent evaluation, Trueface investigated bias in its biometric facial recognition technology using FairFace, a dataset labeled with ethnicity, gender and age groups, writes Cyrus Behroozi, computer vision software developer at Trueface, in a Medium post.

The sample included an almost equal number of images from each ethnicity and gender group to ensure all were adequately represented. Trueface then compared the images pairwise, computing a similarity score for each image pair.

“Since all the images in the dataset belong to different identities, the comparisons are considered impostor matches, and the distribution of similarity scores should be centered about 0,” Behroozi explains. “We then compute the false positive rate at varying similarity-score-thresholds ranging from 0 to 1 for comparisons among each of the ethnicity and gender groups.”
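In code, that procedure amounts to scoring every cross-identity pair and counting how many clear each threshold. The sketch below illustrates the idea with random stand-in embeddings and hypothetical group labels; the names and data are assumptions for illustration, not Trueface’s actual pipeline.

```python
# A minimal sketch of the evaluation described above, assuming each image has
# already been converted to a unit-length face embedding. The embedding size,
# group labels, and random stand-in data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: 200 embeddings of distinct identities, one group label each.
embeddings = rng.normal(size=(200, 128))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)
groups = rng.choice(["Black", "East Asian", "Indian", "White"], size=200)

def false_positive_rates(emb, thresholds):
    """FPR at each threshold over all-impostor pairwise comparisons."""
    sims = emb @ emb.T                    # cosine similarity (unit vectors)
    i, j = np.triu_indices(len(emb), k=1) # each unordered pair once
    scores = sims[i, j]                   # every pair is an impostor match
    # A false positive is an impostor pair scoring at or above the threshold.
    return np.array([(scores >= t).mean() for t in thresholds])

thresholds = np.linspace(0.0, 1.0, 11)
for g in np.unique(groups):
    fpr = false_positive_rates(embeddings[groups == g], thresholds)
    print(f"{g:10s}", np.round(fpr, 3))
```

For an unbiased model, the per-group false positive rate curves printed here would track each other closely at every threshold.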

When the team tested the model across ethnicities at varying thresholds, it performed best for white subjects and worst for East Asian subjects, confirming some bias, albeit with “minimal performance loss.”

According to Behroozi, in an unbiased facial recognition model the false positive rate should be minimal. The evaluation showed that Trueface’s model performs better with females than with males, but the difference, and the resulting bias, is negligible. An unbiased facial recognition model can be deployed in any geographical location without retraining or new data, and it will deliver consistent results regardless of ethnicity and gender. It can, for example, be used in offices in the U.S., Japan, Pakistan, Belarus and the United Kingdom, as well as in grocery chains in different neighborhoods or cities, the company says. Detailed information about the evaluation is available on Medium.

When bias is detected in a facial recognition algorithm, it is a sign that the results may not be equally accurate across all demographics. According to NIST Face Recognition Vendor Test results, a number of algorithms have displayed high false positive rates for females and people with darker skin.

Trueface’s software was recently ranked seventh among 199 algorithms in genuine template comparison time, according to the latest Face Recognition Vendor Test (FRVT) results from the National Institute of Standards and Technology (NIST).
