Idiap researchers propose biometric fairness metrics to decouple bias from accuracy
Several new measures for quantifying demographic differentials, or bias, in biometric identity verification systems are proposed in a paper that has been accepted for publication. The new metrics represent an effort to move beyond whether a sample has been matched to consider how well it matched, by examining matching scores.
The paper ‘Fairness Index Measures to Evaluate Bias in Biometric Recognition’ was authored by Sébastien Marcel and Ketan Kotwal of the Idiap Research Institute. It has been accepted by the International Conference on Pattern Recognition Workshops.
While biometric bias is often associated with facial recognition, the researchers say their metrics are agnostic to the modality used.
The recently proposed Fairness Discrepancy Rate (FDR) is considered, along with the use of “the ROC (Receiver Operating Characteristic) curve as a proxy to measure demographic differentials.”
“While few, existing fairness measures are based on post-decision data (such as verification accuracy) of biometric systems, we discuss how pre-decision data (score distributions) provide useful insights towards demographic fairness,” the authors write in the paper’s abstract.
Marcel and Kotwal propose methods based on weighted fusion of results for each of the three measures, and three variants for each measure to allow assessment from multiple perspectives.
Separation Fairness Index (SFI) measures how far genuine and impostor matching scores for different demographic groups depart from expected values. Compactness Fairness Index (CFI) measures the spread of scores across different groups. Distribution Fairness Index (DFI) measures equitability across overall score distributions. In each case, similar scores, and therefore a fair system, are indicated by closeness to a value of 1.0.
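The general idea behind such distribution-based indices can be illustrated with a minimal sketch. The paper's exact formulas are not reproduced here; the function below is an assumed, simplified stand-in that compares per-group score histograms using Jensen-Shannon divergence and maps the result so that identical distributions across groups score close to 1.0, as the proposed indices do.

```python
import numpy as np

def distribution_fairness_index(scores_by_group, bins=20, score_range=(0.0, 1.0)):
    """Illustrative distribution-based fairness index (NOT the paper's
    exact DFI formula): returns ~1.0 when all demographic groups have
    near-identical matching-score distributions, lower as they diverge.
    Compares each group's score histogram against the pooled average
    histogram via Jensen-Shannon divergence (log base 2, bounded by 1)."""
    eps = 1e-12  # avoids log(0) for empty histogram bins
    hists = []
    for scores in scores_by_group:
        counts, _ = np.histogram(scores, bins=bins, range=score_range)
        hists.append(counts / max(counts.sum(), 1))  # normalize to a PMF
    pooled = np.mean(hists, axis=0)

    def kl(p, q):
        p, q = p + eps, q + eps
        return float(np.sum(p * np.log2(p / q)))

    def jsd(p, q):
        m = 0.5 * (p + q)
        return 0.5 * kl(p, m) + 0.5 * kl(q, m)  # in [0, 1] with log2

    avg_divergence = np.mean([jsd(h, pooled) for h in hists])
    return 1.0 - avg_divergence

# Hypothetical example: three demographic groups with identical score
# distributions vs. three groups whose score distributions are shifted.
rng = np.random.default_rng(0)
same = [rng.normal(0.7, 0.05, 1000).clip(0, 1) for _ in range(3)]
shifted = [rng.normal(m, 0.05, 1000).clip(0, 1) for m in (0.5, 0.7, 0.9)]
print(distribution_fairness_index(same))     # near 1.0: similar distributions
print(distribution_fairness_index(shifted))  # lower: demographic differential
```

Because the comparison operates on score distributions rather than accept/reject decisions, an index like this can flag a demographic differential even when thresholded verification accuracy looks comparable across groups, which is the pre-decision perspective the authors argue for.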
If successful, these metrics indicate how fair a biometric verification system is separately from its accuracy, but they are intended to complement, not replace, outcome-based fairness measures.
NIST is also working on how to measure biometric bias, and is currently seeking feedback on the state of the art.