Biometrics experts discuss facial recognition research on demographic effects and bias
Many, if not all, commercial facial recognition algorithms seem to share common roots that make them susceptible to demographic performance differentials, or bias, but that also suggest a common fix, John Howard told an audience during the International Face Performance Conference (IFPC) 2020. The conference is hosted by the National Institute of Standards and Technology (NIST) and DHS’ Science and Technology Directorate (S&T) along with the European Association of Biometrics (EAB) and the UK’s National Physical Laboratory.
Howard presented the research, with Yevgeniy Sirotin also participating in the conversation in one of eleven sessions held on day one of IFPC 2020.
The testing was conducted by John Howard, Yevgeniy Sirotin and Jerry Tipton of MdTF and Arun Vemury of S&T.
The method of testing and the mathematics that show demographic clustering were briefly explained, with the presentation also touching on the implications of the research and the conclusions that can be drawn at a high level.
By selectively removing principal components that showed high degrees of clustering and then reconstructing the data, the differences between the score distributions for pairs of volunteers of the same gender and race, pairs of different gender and race, and mated pairs were reduced, though not dramatically.
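The presentation described this approach only at a high level. As a rough illustration of the idea, the following is a minimal sketch, not the researchers' actual method: it uses synthetic embeddings with a deliberate group offset, a simple effect-size statistic to flag principal components that cluster by demographic group, and reconstruction with those components removed. The data, the 0.5 threshold, and the statistic are all hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic "face embeddings" for two demographic groups,
# with a deliberate group offset concentrated along a few directions.
n_per_group, dim = 200, 64
offset = np.zeros(dim)
offset[:3] = 2.0
group_a = rng.normal(size=(n_per_group, dim))
group_b = rng.normal(size=(n_per_group, dim)) + offset
X = np.vstack([group_a, group_b])
labels = np.array([0] * n_per_group + [1] * n_per_group)

# PCA via SVD of the mean-centered data.
mean = X.mean(axis=0)
Xc = X - mean
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T  # projection of each sample onto each component

# Flag components whose scores cluster by group: compare the group
# means on each component against the overall spread (an effect size).
mu_a = scores[labels == 0].mean(axis=0)
mu_b = scores[labels == 1].mean(axis=0)
effect = np.abs(mu_a - mu_b) / scores.std(axis=0)
clustered = effect > 0.5  # hypothetical threshold

# Remove the clustered components and reconstruct the embeddings.
keep = ~clustered
X_rebuilt = scores[:, keep] @ Vt[keep] + mean

# The rebuilt data shows much less separation between group means.
sep_before = np.linalg.norm(group_a.mean(0) - group_b.mean(0))
sep_after = np.linalg.norm(
    X_rebuilt[labels == 0].mean(0) - X_rebuilt[labels == 1].mean(0))
print(sep_after < sep_before)  # → True
```

In this toy setting only the component aligned with the injected group offset exceeds the threshold, so removing it collapses most of the between-group separation while leaving the remaining variance intact, mirroring the paper's finding that the reduction is real but not dramatic when group signal is spread across features.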
Communicating the undesirability of “broad homogeneity” in facial recognition algorithms, and the consequences of its inclusion in commercial systems used against large galleries, is important to the facial recognition industry, which is playing catchup to privacy advocates with scientific research into the current technology’s limitations. Howard said that he believes the characteristic is present in all commercial facial recognition systems, and invited conference attendees to inform him of any they believe do not possess it.
The research suggests that the “Daugman” algorithm commonly used in iris recognition systems does not have broad homogeneity.
“Identifying, isolating, and then removing” those features which yield results with broad homogeneity could be the way to reduce bias in face biometrics, Howard concludes.
One audience member asked about the place of the research relative to other studies, and another asked about the selection of an algorithm from among several depending on the demographics of the probe image as a possible alternative approach.
Another audience member asked whether eye colour is a feature used in biometric algorithms, with session chair Tony Mansfield answering that the Daugman algorithm typically does not consider eye colour.
The research will continue, according to Howard and Sirotin, with several different areas of interest under consideration for future tests.
IFPC continues on Wednesday and concludes on Thursday, October 29.