UK oversight bodies seek clarity on facial recognition bias, cooperation on governance

The fallout from a report on the demographic differentials of the facial recognition algorithms UK police use, tucked into an announcement about the technology’s use expanding, has been as charged as it was predictable.
The UK’s National Physical Laboratory found that a retrospective face biometrics algorithm supplied by Cognitec has statistically significant differences in error rates between white subjects and Black or Asian people, as well as between genders and age groups.
The report was the first the Information Commissioner’s Office had heard of the differences, according to a statement issued in response.
“We acknowledge that measures are being taken to address this bias,” ICO Deputy Commissioner Emily Keane is quoted as saying. “However, it’s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.”
The ICO has “asked Home Office for urgent clarity on this matter so we can assess the situation and consider our next steps.”
‘Policing cannot be left to mark its own homework’
The Association of Police and Crime Commissioners (APCC) issued a statement saying the report “shed light on a concerning built-in bias.”
The statement is attributed to APCC Lead on Forensics Darryl Preston (Police and Crime Commissioner for Cambridgeshire), APCC Joint Lead on Race Disparity, Equality and Human Rights Alison Lowe (Deputy Mayor for Policing and Crime for West Yorkshire) and APCC Joint Leads on Performance, Data and New Tech Ethics John Tizard (PCC for Bedfordshire) and Chris Nelson (PCC for Gloucestershire).
“The language is technical but, behind the detail, it seems clear that technology has been deployed into operational policing without adequate safeguards in place,” they write.
The Commissioners say that system failures were known about but not communicated to affected communities or other stakeholders, and emphasize that “governance and accountability on behalf of communities is not a ‘nice to have.’” Transparency and oversight are necessary for public trust.
“We call on policing and the Government to acknowledge the errors made and to work with those responsible for policing governance, locally and nationally, to ensure that scrutiny and transparency are at the heart of the police reform agenda and the forthcoming White Paper. Policing cannot be left to mark its own homework.”
NPL found that Idemia’s biometric algorithm used by the Home Office does not show any differences in performance between demographic groups. The NEC algorithm used for live facial recognition by UK police was tested by the NPL in 2023, and the Met Police claimed the study found no statistically significant bias, though that analysis has been disputed.