Mitigating demographic bias in facial presentation attack detection

Bias in AI-powered biometric algorithms can lead to unfair treatment of individuals in certain demographic groups based on their gender, age, and race.

As with matching algorithms, bias in liveness detection can lead to higher error rates and reduced access to mobile banking services for individuals based on their appearance.

This white paper by ID R&D introduces methodologies to address bias in support of Responsible AI principles.
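A first step toward addressing such bias is measuring it. As a minimal sketch (not ID R&D's methodology), the snippet below computes the BPCER (bona fide presentation classification error rate, per ISO/IEC 30107-3) separately for each demographic group and reports the largest gap between groups; the sample scores, group labels, and 0.5 threshold are illustrative assumptions.

```python
# Hedged sketch: per-group error-rate disparity for a liveness detector.
# Higher scores mean "more likely bona fide"; samples below the
# threshold are rejected as presentation attacks.
from collections import defaultdict

def bpcer_by_group(scores, is_bonafide, groups, threshold=0.5):
    """BPCER per demographic group: the fraction of genuine (bona fide)
    samples wrongly rejected as attacks."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for score, bonafide, group in zip(scores, is_bonafide, groups):
        if bonafide:
            totals[group] += 1
            if score < threshold:  # genuine user rejected as an attack
                errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Illustrative data: four bona fide samples and two attacks,
# split across two hypothetical demographic groups "A" and "B".
scores      = [0.9, 0.4, 0.8, 0.6, 0.7, 0.2]
is_bonafide = [True, True, True, True, False, False]
groups      = ["A", "A", "B", "B", "A", "B"]

rates = bpcer_by_group(scores, is_bonafide, groups)
gap = max(rates.values()) - min(rates.values())
```

Here group A's BPCER is 0.5 while group B's is 0.0, so the gap of 0.5 flags a disparity: genuine users in group A are rejected far more often, which is exactly the kind of unequal treatment a bias-mitigation effort targets.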
