Chincotech tackles racial bias in facial recognition systems
Tokyo-based software company Chincotech has announced the development of a multi-racial facial recognition system intended to deliver greater accuracy than traditional systems, which often have unacceptably high error rates for non-white individuals.
In tests of facial analysis systems by MIT Media Lab researcher Joy Buolamwini, gender was misidentified for less than 1 percent of lighter-skinned males and up to 7 percent of lighter-skinned females, The New York Times reports. The same systems misidentified the gender of up to 12 percent of darker-skinned males, and a shocking 35 percent of darker-skinned females.
The datasets used to test facial recognition systems may be contributing to the problem, as one widely used collection of images is estimated to be more than 75 percent male and more than 80 percent white. Haverford College computer scientist Sorelle Friedler, a reviewing editor on Buolamwini’s research paper, said that experts have long suspected that the performance of facial recognition systems varies with the population being considered, and that the research is the first to empirically confirm that suspicion.
The paper, “Gender Shades,” written by Buolamwini and Microsoft researcher Timnit Gebru, studied facial analysis systems from Microsoft, IBM, and Megvii.
Chincotech is combating the problem with a 3D face-transformation algorithm that continuously learns multi-racial facial characteristics in order to accurately identify people in 2D images.
“Our tests have proved that when this technique is coupled with a system that is taught to learn the differences between races, you have a system that delivers significantly more accurate results,” said Chincotech Head Software Development Engineer Paul Rashford.
Buolamwini has given a TED Talk on coded bias and advocates for algorithmic accountability as founder of the Algorithmic Justice League.
As previously reported, University of Surrey researchers developed a multi-racial facial recognition system last year that delivers more accurate results than typical systems.
This post was updated at 9:22am on July 27, 2021, to clarify that the Gender Shades study tests facial analysis algorithms, not identification algorithms.