
Amazon claims to cut facial recognition bias with unlabeled data


Amazon Web Services has unveiled a method to evaluate bias in facial recognition algorithms without using annotated identity labels.

A research paper on the topic, spotted by MarktechPost, describes a way to detect performance disparities that are indicative of bias.

The method, called Semi-supervised Performance Evaluation for Face Recognition, or SPE-FR, reportedly detects such disparities even though it estimates a model’s performance on data from different demographic groups without identity annotations.

SPE-FR could make evaluating models for bias “much more practical for creators of face recognition software,” according to the Post article.

“It can be especially useful to companies and agencies prior to system adoption who may otherwise be unable to estimate system performance or detect potential biases because they cannot collect reliable identity annotations for their data,” according to the paper.

In experiments, the researchers trained face biometrics models on data altered to create bias by hiding information about specific demographic groups. SPE-FR consistently flagged the resulting performance disparities in these altered datasets.

In fact, SPE-FR outperformed Bayesian calibration.

“SPE-FR can be applied off-the-shelf to a wide range of face embedding models with state-of-the-art designs and trained on different datasets,” reads the paper.
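The paper’s full algorithm is Bayesian and more involved, but the general idea of label-free performance estimation can be illustrated with a toy sketch: for each demographic group, fit a two-component mixture to pairwise embedding similarities and compare how cleanly the impostor and genuine modes separate. Everything below (the function names, the synthetic data, the separation score) is a hypothetical illustration of that idea, not Amazon’s SPE-FR implementation.

```python
# Hypothetical sketch: estimating per-group face verification quality
# WITHOUT identity labels, by fitting a two-component 1-D Gaussian
# mixture to pairwise embedding similarities. Illustrative only; this
# is not the SPE-FR algorithm from the Amazon paper.
import numpy as np

rng = np.random.default_rng(0)

def cosine_pairs(embeddings):
    """All pairwise cosine similarities between L2-normalized embeddings."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = e @ e.T
    iu = np.triu_indices(len(e), k=1)
    return sims[iu]

def fit_two_gaussians(sims, iters=50):
    """Tiny EM for a 2-component 1-D mixture (impostor vs. genuine scores)."""
    mu = np.array([sims.min(), sims.max()])
    sd = np.array([sims.std(), sims.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each similarity
        dens = w * np.exp(-0.5 * ((sims[:, None] - mu) / sd) ** 2) / sd
        r = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: re-estimate weights, means, and standard deviations
        w = r.mean(axis=0)
        mu = (r * sims[:, None]).sum(axis=0) / r.sum(axis=0)
        sd = np.sqrt((r * (sims[:, None] - mu) ** 2).sum(axis=0)
                     / r.sum(axis=0)) + 1e-6
    return mu, sd

def estimated_separation(sims):
    """Gap between the two fitted modes: a crude, label-free accuracy proxy."""
    mu, sd = fit_two_gaussians(sims)
    return abs(mu[1] - mu[0]) / (sd[0] + sd[1])

def make_group(n_ids, per_id, spread):
    """Synthetic embeddings: identities as cluster centers plus noise."""
    centers = rng.normal(size=(n_ids, 64))
    return np.vstack([c + spread * rng.normal(size=(per_id, 64))
                      for c in centers])

# Group B's identities are noisier, simulating a model that performs
# worse on that demographic; the separation score should reflect it.
sep_a = estimated_separation(cosine_pairs(make_group(20, 5, 0.3)))
sep_b = estimated_separation(cosine_pairs(make_group(20, 5, 0.9)))
print(f"group A separation: {sep_a:.2f}, group B separation: {sep_b:.2f}")
```

A large gap between the groups’ separation scores would hint at differential performance, all without ever knowing which pairs share an identity; the actual paper replaces this heuristic with a principled Bayesian treatment and confidence bounds.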

Biometric Services Gateway under scrutiny in UK

Biometric bias is discussed in another report as well, this one focusing on UK police forces’ increased use of the Biometric Services Gateway, a mobile fingerprinting hardware and software system, along with a live facial recognition pilot deployed by South Wales Police.

The Gateway was first deployed in the UK in 2018. It allows police officers to scan a fingerprint and compare it to police and immigration databases. The latest report by civil rights advocates Racial Justice Network and Yorkshire Resists follows up on a previous ‘Stop the Scan’ report from last year.

The document is based on freedom of information responses from 35 police agencies. (Eleven refused to respond, citing lack of time and resources.)

Spotted by The Justice Gap, the report claims that the Gateway is not used objectively by police. Officers can demand a biometric scan from anyone they suspect of having committed a crime or of lying about their identity.

The report suggests that Black people are four times more likely than white people to be stopped and biometrically scanned, and Asian people twice as likely. Men are about 12 times more likely to be stopped and scanned than women.

The unequal proportion of queries could have been exacerbated by biometric algorithms used in South Wales Police’s facial recognition program, which ended in 2020, the advocacy groups argue.

The study points to NIST data from 2019 showing that some algorithms were 10 to 100 times more likely to misidentify a Black or East Asian face than a white one. Not all of the algorithms evaluated are in commercial production, however, and others were found to have imperceptible differences in performance between demographics, prompting NIST Biometric Standards and Testing Lead Patrick Grother to urge those implementing facial recognition to be specific when evaluating bias.
