UK knowingly deploys biased facial recognition passport checking system

The UK has admitted that it introduced a biometric facial recognition passport photo checker in 2016 despite being aware the system would have difficulty recognizing people with very light or very dark skin tones, New Scientist reports.

“User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph,” the UK Home Office wrote in a document released following a freedom of information (FOI) request. “However, the overall performance was judged sufficient to deploy.”

According to New Scientist, two Black users have publicized the errors they encountered with the system.

“A person’s race should not be a barrier to using technology for essential public services,” says a spokesperson for the UK’s Equality and Human Rights Commission. “We are disappointed that the government is proceeding with the implementation of this technology despite evidence that it is more difficult for some people to use it based on the color of their skin.”

Algorithmic bias in facial recognition technology has long been discussed. In July, the UK Information Commissioner’s Office (ICO) selected 10 of 64 candidate projects, one of which focuses on identifying and mitigating algorithmic bias in machine learning models used for remote biometric identity verification. Also in July, an artist built an intelligent water gun that uses facial recognition to identify targets for the European Commission’s SHERPA project on ethics in machine learning, demonstrating how algorithmic bias can lead to discrimination and unfair treatment.

Although certain facial recognition systems have failed to detect people with darker skin tones, failure rates can vary widely from one system to another.
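Comparing failure rates across demographic groups is the basic measurement behind such bias findings. As a minimal illustrative sketch (not the Home Office's actual system or data; the group labels and log format are hypothetical), per-group failure rates of an automated photo check could be tallied from logged outcomes like this:

```python
# Hypothetical example: measure how often an automated photo check
# rejects submissions, broken down by demographic group.
from collections import defaultdict

def failure_rates(results):
    """results: iterable of (group, passed) pairs.
    Returns the fraction of failed checks per group."""
    totals = defaultdict(int)
    failures = defaultdict(int)
    for group, passed in results:
        totals[group] += 1
        if not passed:
            failures[group] += 1
    return {g: failures[g] / totals[g] for g in totals}

# Fabricated log for illustration only, not real error rates.
log = [
    ("light", True), ("light", True), ("light", False), ("light", True),
    ("dark", True), ("dark", False), ("dark", False), ("dark", True),
]
print(failure_rates(log))  # {'light': 0.25, 'dark': 0.5}
```

A large gap between groups in such a tally is what audits of facial recognition systems flag as disparate performance.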

The Home Office claims applicants can simply proceed with the passport application and override the automated check, but it is not that simple: the system warns users that problems could arise with the application if the photo does not meet the requirements.

The Home Office said it would “continue to conduct user research and usability testing with appropriate participants to ensure that users from different ethnicities can follow the photo guidance and provide a photo that passes the photo checks.”

University of Sheffield computer scientist Noel Sharkey told New Scientist that the affair demonstrates the need for new regulation of facial recognition.

Earlier this year, a U.S. House subcommittee discussed the use of facial recognition technology, how biased algorithms can lead to discrimination, and the lack of diversity at tech companies. There was also a focus on how facial recognition success rates differ between white men and women of color. IBM has recently launched software to analyze how and why algorithms make decisions, as well as detect bias and recommend changes. For the past year, MIT researchers have also been working on an automated tool to ‘de-bias’ AI training data.


