Mitigating demographic bias in facial presentation attack detection
Bias in AI-powered biometric algorithms can lead to unfair treatment of individuals in certain demographic groups based on gender, age, or race.
As with face matching algorithms, bias in liveness detection can produce higher error rates for some users and reduce their access to services such as mobile banking based on their appearance.
This white paper by ID R&D introduces methodologies to address bias in support of Responsible AI principles.