
UK MPs examine Met police use of facial recognition

Ethnicity, thresholds, biometric matching discussed
 

The facial recognition system used by the UK’s largest police force is “very accurate,” but when live facial recognition is run at lower face-match thresholds, a slight difference in performance between some demographic groups was observed, testing by the UK’s National Physical Laboratory (NPL) has shown.

“We find that, if the system is run at low and easy thresholds, the system starts showing a bias against Black males and females combined,” said Tony Mansfield, principal research scientist at NPL. However, Mansfield added that he believes the police did not operate the system at these thresholds.

Mansfield delivered the results of NPL’s testing on Tuesday to the Science, Innovation and Technology Committee, alongside Lindsey Chiswick, director of intelligence at the Met Police, as part of an inquiry into the governance of artificial intelligence.

The NEC system used by the Metropolitan Police Service has become a subject of controversy, with rights groups raising concerns about false positives and discrimination.

Research on the NEC Neoface V4 algorithm using the HD5 face detector started in 2022 for the Metropolitan Police and South Wales Police, with the laboratory probing three policing use cases: Live Facial Recognition (LFR), Retrospective Facial Recognition (RFR) and Operator Initiated Facial Recognition (OIFR).

In April, the NPL delivered its first set of results, finding “substantial improvement” in the system’s accuracy. The Met Police said at the time that the NPL’s study of NEC’s system gave it the confidence to resume facial recognition surveillance after it was paused.

Rights groups such as Big Brother Watch, however, have argued that the accuracy rate is not high enough and noted that the report did not include ethnicity breakdowns.

The newest NPL report is meant to provide more insight into how ethnicity affects facial recognition. The test showed that false positive identifications increase at lower face-match thresholds (0.58 and 0.56) and start showing a “statistically significant imbalance between demographics with more Black subjects having a false positive than Asian or White subjects.”

“The Asian females were the best performing—had the best recognition rate of the demographics—and the Black females had the lowest. But those were really quite close.”

The system, however, showed no bias when applied to a watchlist of 178,000 filler images at higher thresholds (0.64 or higher), producing no false positives. Testing at medium thresholds (0.60 to 0.62) brought up eight false positives but no statistically significant bias between demographics, according to the report.
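To illustrate the mechanism under discussion, the following is a minimal sketch of threshold-based matching. The threshold values echo those cited in the report, but the similarity scores, watchlist entries and function names are hypothetical; they do not represent NEC’s actual system or API, only the general idea that lowering the match threshold admits more, and more error-prone, alerts.

# Illustrative sketch (not NEC's API): how a face-match threshold trades off
# alerts against false positives. Scores and names below are hypothetical.

def matches_above_threshold(similarity_scores, threshold):
    """Return watchlist entries whose similarity score clears the threshold."""
    return [(entry, score) for entry, score in similarity_scores if score >= threshold]

# Hypothetical similarity scores for one probe face against a small watchlist.
scores = [("suspect_A", 0.71), ("filler_1", 0.61), ("filler_2", 0.57)]

for threshold in (0.64, 0.60, 0.56):  # high, medium and low settings from the report
    hits = matches_above_threshold(scores, threshold)
    print(f"threshold {threshold}: {len(hits)} alert(s) -> {hits}")

# At 0.64 only the true match alerts; at 0.60 and 0.56 the filler (non-matching)
# entries also clear the bar, i.e. potential false positives.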

Michael Birtwistle from the Ada Lovelace Institute, however, noted during the committee session that the tests conducted by NPL were “a snapshot of a single version of a single system” that cannot be generalized.

“We think that without a proper regulatory framework, there isn’t a guarantee that facial recognition systems deployed by police will meet reasonable standards of accuracy or that their use will remain proportionate to the risks presented by them,” said Birtwistle.

Marion Oswald, senior research associate at the Alan Turing Institute, said there is definitely potential in these tools, even though their use is currently limited because police units rely on different data formats and databases that are not connected.

“That in itself creates a risk,” she said.

Oswald also added that police need to be aware of issues such as ethnic disparities when deploying facial recognition technologies. Errors will occur at thresholds lower than those recommended: at the 0.56 setting, 22 of the 33 people who were falsely identified were Black, she said.

“It’s very difficult when you’re presented with a match to decide what to do with it unless you really understand the settings of the system, what the uncertainties might be, what the risks are for errors and biases in the reports that the NPL issued,” said Oswald.

Since the research was published, the NEC system has been used three times, producing alerts and zero false positives, according to the Met Police’s Chiswick.
