
Chincotech tackles racial bias in facial recognition systems


Tokyo-based software company Chincotech has announced the development of a multi-racial facial recognition system that provides greater accuracy than traditional systems, which often have unacceptably high error rates for non-white individuals.

In tests of facial recognition systems by M.I.T. Media Lab researcher Joy Buolamwini, gender was misidentified for less than 1 percent of lighter-skinned males and up to 7 percent of lighter-skinned females, The New York Times reports. The same systems misidentified the gender of up to 12 percent of darker-skinned males, and a shocking 35 percent of darker-skinned females.
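The disaggregated evaluation behind these figures can be sketched in a few lines: rather than reporting one overall accuracy, error rates are computed separately for each demographic subgroup. The records below are hypothetical, purely to illustrate the method, not data from the study.

```python
# Hypothetical audit: gender-classification error rates broken out per subgroup,
# mirroring the disaggregated evaluation approach of the Gender Shades study.
from collections import defaultdict

# (subgroup, true_label, predicted_label) -- illustrative records, not real data
predictions = [
    ("lighter_male", "male", "male"),
    ("lighter_female", "female", "male"),
    ("darker_male", "male", "female"),
    ("darker_female", "female", "male"),
    ("darker_female", "female", "female"),
]

def error_rates(records):
    """Return the fraction of misclassified examples within each subgroup."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        if truth != predicted:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

print(error_rates(predictions))
```

A system with high aggregate accuracy can still fail badly on a minority subgroup; this per-group breakdown is what exposed the disparities reported above.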

The datasets used to test facial recognition systems may be contributing to the problem, as one widely used collection of images is estimated to be more than 75 percent male and more than 80 percent white. Haverford College computer scientist Sorelle Friedler, a reviewing editor on Buolamwini’s research paper (PDF), said that experts have long suspected that the performance of facial recognition systems depends on the population being considered, and that the research is the first to empirically confirm that suspicion.

The paper, written by Buolamwini and Microsoft researcher Timnit Gebru, studied facial analysis systems from Microsoft, IBM, and Megvii.

Chincotech is combating this challenge with a 3D transforming face algorithm that continuously learns multi-racial characteristics to accurately identify people in 2D pictures.

“Our tests have proved that when this technique is coupled with a system that is taught to learn the differences between races, you have a system that delivers significantly more accurate results,” said Chincotech Head Software Development Engineer Paul Rashford.

Buolamwini has given a TED Talk on coded bias, and advocates for algorithmic accountability as a founder of the Algorithmic Justice League.

As previously reported, University of Surrey researchers developed a multi-racial facial recognition system last year that delivers more accurate results than is typical.

This post was updated at 9:22am on July 27, 2021, to clarify that the Gender Shades study tests facial analysis algorithms, not identification algorithms.
