A year later, UK passport biometric face scan system still favors white males
Almost a year to the day after media stories questioned bias in UK biometric face scanning systems, it appears little has changed.
A facial recognition photo checker deployed by the UK government for use in grading submitted passport pictures continues to reject more images of women with darker skin tones than anyone else.
The biometric algorithm is designed to accept or reject submitted images based on the UK’s quality and pose rules. Among other things, people must close their mouths, look at the camera and assume a neutral expression.
A BBC story quotes two women who took their own photos according to the regulations and submitted them, only to have the software say they were of poor quality. The photo-checking app replied to one of the women with the message, "It looks like your mouth is open," when that plainly was not the case.
The BBC quoted a government spokeswoman saying that 9 million people have used the system, which she said is improving.
The broadcaster submitted more than 1,000 photographs of politicians from around the globe to test the biometric algorithm.
Twenty-two percent of dark-skinned women's photos were judged poor quality, while light-skinned women's images failed 14 percent of the time.
Dark-skinned men's photos failed 15 percent of the time; light-skinned men's pictures failed 9 percent of the time, according to the BBC.
Last October, the British government was forced to admit that the facial recognition photo checker had been deployed in 2016 despite the algorithm's known difficulty reading images of people other than white men. Officials had decided its quality was good enough to deploy anyway.
accuracy | algorithms | biometric identification | biometric passport | biometrics | face photo | facial recognition | UK