Mitigating demographic bias in facial presentation attack detection
Bias in AI-powered biometric algorithms can lead to unfair treatment of individuals in certain demographic groups based on attributes such as gender, age, or race.
As with matching algorithms, bias in liveness detection can lead to higher error rates and reduced access to mobile banking services for individuals based on their appearance.
This white paper by ID R&D introduces methodologies to address bias in support of Responsible AI principles.
Download White Paper