
Chincotech tackles racial bias in facial recognition systems

Tokyo-based software company Chincotech has announced the development of a multi-racial facial recognition system that offers greater accuracy than traditional systems, which often have unacceptably high error rates for non-white individuals.

In tests of facial recognition systems by MIT Media Lab researcher Joy Buolamwini, gender was misidentified for less than 1 percent of lighter-skinned males and up to 7 percent of lighter-skinned females, The New York Times reports. The same systems misidentified the gender of up to 12 percent of darker-skinned males, and a shocking 35 percent of darker-skinned females.
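
Disparities like these come from measuring misclassification rates separately for each demographic subgroup rather than reporting a single overall accuracy figure. A minimal sketch of that kind of per-group evaluation might look like the following; the group labels and sample records here are illustrative placeholders, not data from the study.

```python
# Illustrative per-subgroup error-rate audit. The records below are
# made-up examples, not data from the Gender Shades study.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.

    Returns a dict mapping each group to its misclassification rate.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, true_label, predicted in records:
        totals[group] += 1
        if predicted != true_label:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

sample = [
    ("lighter_male", "M", "M"),
    ("lighter_male", "M", "M"),
    ("darker_female", "F", "M"),  # one misclassification
    ("darker_female", "F", "F"),
]
print(error_rates_by_group(sample))
# {'lighter_male': 0.0, 'darker_female': 0.5}
```

An audit of this shape makes the gap between groups explicit: a system can score well on aggregate accuracy while one subgroup's error rate is many times another's.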

The datasets used to test facial recognition systems may be contributing to the problem: one widely used collection of images is estimated to be more than 75 percent male and more than 80 percent white. Haverford College computer scientist Sorelle Friedler, a reviewing editor on Buolamwini’s research paper (PDF), said that experts have long suspected that the performance of facial recognition systems depends on the population being considered, and that the research is the first to empirically confirm that suspicion.

The paper, written by Buolamwini and Microsoft researcher Timnit Gebru, studied facial analysis systems from Microsoft, IBM, and Megvii.

Chincotech is combating the problem with a 3D face-transformation algorithm that continuously learns multi-racial facial characteristics in order to accurately identify people in 2D pictures.

“Our tests have proved that when this technique is coupled with a system that is taught to learn the differences between races, you have a system that delivers significantly more accurate results,” said Chincotech Head Software Development Engineer Paul Rashford.

Buolamwini has given a TED Talk on coded bias, and advocates for algorithmic accountability as a founder of the Algorithmic Justice League.

As previously reported, University of Surrey researchers last year developed a multi-racial facial recognition system that delivers more accurate results than are typical.

This post was updated at 9:22am on July 27, 2021, to clarify that the Gender Shades study tests facial analysis algorithms, not identification algorithms.
