UK MPs examine Met police use of facial recognition
The facial recognition system used by the UK’s largest police force is “very accurate”, but testing by the UK’s National Physical Laboratory (NPL) has shown that, when live facial recognition is run at lower face-match thresholds, a slight difference in performance between some demographic groups emerges.
“We find that, if the system is run at low and easy thresholds, the system starts showing a bias against Black males and females combined,” said Tony Mansfield, principal research scientist at NPL. However, Mansfield added that he believes that the police did not operate the system at these thresholds.
Mansfield delivered the results of NPL’s testing on Tuesday to the Science, Innovation and Technology Committee, alongside Lindsey Chiswick, director of intelligence at the Met Police, as part of an inquiry into the governance of artificial intelligence.
The NEC system, used by the Metropolitan Police Service for policing, has become a subject of controversy, with rights groups raising concerns about false positives and discrimination.
Research on the NEC NeoFace V4 algorithm with the HD5 face detector began in 2022 for the Metropolitan Police and South Wales Police, with the laboratory probing three policing use cases: Live Facial Recognition (LFR), Retrospective Facial Recognition (RFR) and Operator Initiated Facial Recognition (OIFR).
In April, the NPL delivered its first set of results, finding “substantial improvement” in the system’s accuracy. The Met Police said at the time that the NPL’s study of NEC’s system had given it the confidence to resume facial recognition surveillance after it was paused.
Rights groups such as Big Brother Watch, however, have argued that the accuracy rate is not high enough and noted that the report did not include ethnicity breakdowns.
The newest NPL report is meant to provide more insight into how ethnicity affects facial recognition accuracy. The test showed that false positive identifications increase at lower face-match thresholds (0.58 and 0.56) and start showing a “statistically significant imbalance between demographics with more Black subjects having a false positive than Asian or White subjects.”
“The Asian females were the best performing—had the best recognition rate of the demographics—and the black females had the lowest. But those were really quite close,” Mansfield said.
The system, however, showed no bias when applied to a 178,000-image filler watchlist at higher thresholds (0.64 or above), producing no false positives. Testing at medium thresholds (0.60 to 0.62) produced eight false positives but no statistically significant bias between demographics, according to the report.
Michael Birtwistle from the Ada Lovelace Institute, however, noted during the session held by the Committee that the tests conducted by NPL were “a snapshot of a single version of a single system” which cannot be generalized.
“We think that without a proper regulatory framework, there isn’t a guarantee that facial recognition systems deployed by police will meet reasonable standards of accuracy or that their use will remain proportionate to the risks presented by them,” said Birtwistle.
Marion Oswald, senior research associate at the Alan Turing Institute, said that there is definite potential in these tools, even though their use is currently limited because police units rely on different data formats and databases that are not connected.
“That in itself creates a risk,” she said.
Oswald added that police need to be aware of issues such as ethnic disparities when deploying facial recognition technologies. Errors occur at thresholds lower than the recommended one: at the 0.56 setting, 22 of the 33 people falsely identified were Black, she said.
“It’s very difficult when you’re presented with a match to decide what to do with it unless you really understand the settings of the system, what the uncertainties might be, what the risks are for errors and biases in the reports that the NPL issued,” said Oswald.
Since the research was published, the NEC system has been used three times, generating alerts and producing zero false positives, according to the Met Police’s Chiswick.