Biometrics can be biased, but how biased? Researchers propose a ruler
Researchers in the United States say they have created a method for the “rigorous statistical comparison” of error rates across demographic groups, math that could be applied to fighting bias in biometric algorithms.
Having one or more standard ways to measure bias would be a boon for many AI sectors, but particularly facial recognition, where biases are known to lurk.
The researchers are from Clarkson University and St. Lawrence University, and include prominent figures in biometrics like Michael Schuckers, Charles A. Dana Professor of Statistics and Data Science, and Stephanie Schuckers, a recent panelist on Biometric Update’s webinar on responsible and ethical commercial biometrics. The work was supported by Clarkson’s Center for Identification Technology Research, better known as CITeR, and partly funded by a National Science Foundation grant.
The code they wrote looks for “statistically distinguishable” false non-match rates, or FNMRs.
“We think of fairness as meaning that the FNMR’s are not statistically different across one or more demographic categories,” the team wrote in their paper.
The method involves bootstrapping individuals across the different demographic groups, measuring the variation of error rates in each, and then building “a distribution of the maximal variation for the overall error rate.”
The result, they say, is that they can “determine which, if any, groups have error rates that differ from the rest.”
The method can make simultaneous comparisons across racial, education and age demographics in situations where individuals are “classified into one category within each of those groups.”
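In broad strokes, that kind of bootstrap check can be sketched in a few lines of Python. The snippet below is an illustrative reading of the description above, not the researchers’ actual code: it resamples each group’s comparisons under the assumption that every group shares the overall FNMR, builds the distribution of the largest group-level deviation, and flags groups whose observed rate falls outside it. The function name, threshold choice and toy data are all hypothetical, and only a single demographic factor is shown.

```python
# Illustrative sketch only, not the published method or code.
import numpy as np

rng = np.random.default_rng(42)

def flag_distinguishable_groups(groups, n_boot=2000, alpha=0.05):
    """groups maps a label to a 0/1 array (1 = false non-match per comparison)."""
    pooled = np.concatenate(list(groups.values()))
    overall_fnmr = pooled.mean()
    observed_gap = {g: abs(x.mean() - overall_fnmr) for g, x in groups.items()}

    # Bootstrap: draw each group's comparisons from the pooled data, so all
    # groups share the overall FNMR, and record the maximal deviation per round.
    max_gaps = np.empty(n_boot)
    for b in range(n_boot):
        gaps = [abs(rng.choice(pooled, size=len(x), replace=True).mean() - overall_fnmr)
                for x in groups.values()]
        max_gaps[b] = max(gaps)

    # A group is flagged if its observed gap exceeds the upper quantile of the
    # maximal-variation distribution built under equal error rates.
    threshold = np.quantile(max_gaps, 1 - alpha)
    return {g: gap > threshold for g, gap in observed_gap.items()}, threshold

# Toy data: three demographic groups, one deliberately given a higher FNMR.
groups = {
    "group_A": rng.binomial(1, 0.02, 800),
    "group_B": rng.binomial(1, 0.02, 800),
    "group_C": rng.binomial(1, 0.06, 800),
}
flags, threshold = flag_distinguishable_groups(groups)
print(f"95th-percentile maximal deviation under equal rates: {threshold:.4f}")
for g, differs in flags.items():
    status = "statistically distinguishable" if differs else "consistent with overall rate"
    print(f"{g}: {status}")
```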
Despite no small amount of consternation in the industry and among the public about biased algorithms, the team said, there has been “little work in this area incorporating statistical variation.”