Trueface evaluates bias factor in its biometric facial recognition model

In a recent evaluation, Trueface investigated bias in its biometric facial recognition technology using Fairface, a dataset labeled with ethnicity, gender, and age groups, writes Cyrus Behroozi, computer vision software developer at Trueface, in a Medium post.

The sample included a nearly equal number of images from each ethnicity and gender group, ensuring all were adequately represented. Trueface compared the images pairwise to compute a similarity score for each image pair.

“Since all the images in the dataset belong to different identities, the comparisons are considered impostor matches, and the distribution of similarity scores should be centered about 0,” Behroozi explains. “We then compute the false positive rate at varying similarity-score-thresholds ranging from 0 to 1 for comparisons among each of the ethnicity and gender groups.”
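The procedure Behroozi describes, sweeping a similarity-score threshold from 0 to 1 and measuring the false positive rate among impostor pairs for each demographic group, can be sketched as follows. This is an illustrative reconstruction using synthetic scores; the group names, score distributions, and thresholds are assumptions, not Trueface's actual data or code:

```python
import numpy as np

# Synthetic impostor similarity scores per demographic group.
# Every pair belongs to different identities, so each comparison is an
# impostor match and scores should cluster near 0.
rng = np.random.default_rng(0)
scores = {
    "Group A": rng.normal(0.00, 0.1, 10_000).clip(0, 1),
    "Group B": rng.normal(0.02, 0.1, 10_000).clip(0, 1),
}

def false_positive_rate(impostor_scores, threshold):
    """Fraction of impostor pairs whose similarity meets or exceeds the threshold."""
    return float(np.mean(impostor_scores >= threshold))

# Sweep thresholds from 0 to 1 and compute the FPR curve for each group.
thresholds = np.linspace(0.0, 1.0, 101)
fpr_curves = {
    group: [false_positive_rate(s, t) for t in thresholds]
    for group, s in scores.items()
}
```

Comparing the resulting FPR curves across groups reveals bias: an unbiased model produces nearly identical curves regardless of demographic group, while a biased one shows elevated false positives for some groups at the same operating threshold.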

When the team tested the model across ethnicities at varying thresholds, performance was best for White subjects and worst for East Asian subjects, confirming some bias, though with “minimal performance loss.”

According to Behroozi, in an unbiased facial recognition model the false positive rate should be minimal across all groups. The evaluation showed Trueface’s model performs slightly better for females than males, but the difference is negligible. An unbiased facial recognition model can be deployed in any geographical location without retraining or new data, and it will deliver comparable results regardless of ethnicity and gender. It can, for example, be used in offices in the U.S., Japan, Pakistan, Belarus, and the United Kingdom, as well as in grocery chains in different neighborhoods or cities, the company says. Detailed information about the evaluation can be reviewed on Medium.

When bias is detected in facial recognition algorithms, it is a sign that the results may not be accurate across all demographics. According to NIST Facial Recognition Vendor Test results, a number of algorithms have displayed high false positive rates for females and people with dark skin.

Trueface’s software was recently ranked seventh among 199 algorithms in genuine template comparison time, according to the latest Facial Recognition Vendor Test (FRVT) results from the National Institute of Standards and Technology (NIST).
