UK regulators pan patchwork policy for law enforcement facial recognition

Police, retail biometrics deployments raising questions still not easily answered

The UK’s two Biometrics Commissioners shared cautionary observations about the use of facial recognition in law enforcement over the weekend in a report that raises several important questions about how police and retailers are using the technology.

The UK needs a Biometric Surveillance Act, former Biometrics and Surveillance Camera Commissioner Fraser Sampson argued in a Biometric Update column last week. The current legal environment, he says, prioritizes flexibility in a way that undermines certainty.

Prof. William Webster, commissioner for England and Wales, told The Guardian that as deployments increased over the past year, the “slow pace of legislation was trying to catch up with the real world.” As it lagged, “the horse had gone before the cart.”

Dr. Brian Plastow, commissioner for Scotland, says facial recognition is “nowhere near as effective as the police claim it is,” and characterized the law behind it as “patchwork.”

Sampson used the same term in his column to describe facial recognition deployments across the UK that vary from one local area to the next.

The claims of high accuracy? Plastow says police in England and Wales are “really just marking their own homework.”

Webster said earlier this year, in his response to the Home Office’s consultation on live facial recognition, that the UK has a “once-in-a-generation opportunity to get a legal framework right” with its plan to legislate the horse back to the right side of the proverbial cart.

Similarity threshold confidence

Concerns about biometric accuracy and bias have led to policy recommendations, but in the absence of a legal standard, different police forces take different approaches to ensuring system performance.

A “Rapid response” post by UK Parliament on “Facial recognition technology in policing” dated April 8 notes that “An accuracy threshold of 0.6 is used in some police live facial recognition (LFR) systems. Any lower similarity scores are disregarded.”

The post then presents accuracy figures based on that threshold.

This framing raises the question of how many police forces are using lower similarity score thresholds. The question seems particularly pertinent given that the UK’s National Physical Laboratory (NPL) recommends an operational threshold of 0.64, which the Home Office has adopted.
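To make the threshold figures concrete, here is a minimal sketch of how a similarity cut-off determines which candidate matches become alerts. All names and data structures are hypothetical illustrations, not any vendor's or force's actual system; only the threshold values (0.6, 0.64) come from the article.

```python
# Hypothetical sketch of similarity-threshold filtering in a live facial
# recognition (LFR) pipeline. A probe face is compared against a watchlist,
# producing (identity, similarity_score) candidates; only candidates at or
# above the operational threshold generate alerts.

NPL_RECOMMENDED_THRESHOLD = 0.64  # threshold recommended by the NPL

def filter_alerts(candidates, threshold=NPL_RECOMMENDED_THRESHOLD):
    """Return only candidate matches at or above the similarity threshold."""
    return [(wid, score) for wid, score in candidates if score >= threshold]

# Example: the same candidates produce different alert counts as the
# threshold is lowered, which is why the choice of cut-off matters.
candidates = [("A", 0.71), ("B", 0.62), ("C", 0.55)]
print(filter_alerts(candidates))        # 0.64 -> only "A" alerts
print(filter_alerts(candidates, 0.6))   # 0.6  -> "A" and "B" alert
print(filter_alerts(candidates, 0.55))  # 0.55 -> all three alert
```

Lowering the threshold admits more borderline matches, which is the trade-off at the heart of the oversight question: more potential identifications, but more false matches alongside them.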

Police, however, have the option of lowering face-match thresholds without judicial oversight, as Labour MP Dawn Butler noted in late 2024.

The “Live Facial Recognition Annual Report 2025” produced by London’s Metropolitan Police indicates an attempt to follow the guidance.

“Independent testing by the National Physical Laboratory found that at the thresholds the MPS work at (minimum of 0.6, 0.64 from July 2024 to present) the system is accurate and balanced with regard to ethnicity and gender.”

A policy document for South Wales Police notes that the force uses a threshold of “0.64 with the current FRT algorithm used by SWP,” and can only be lowered “with a full rationale” in an application.

Essex Police has used a 0.55 similarity score threshold since August 2024, and provides several paragraphs explaining its decision in a frequently asked questions post. The force commissioned two independent studies to look into the risk of algorithmic bias, which returned a split verdict. Essex Police told Biometric Update it paused its deployment to work with its software supplier, Corsight AI, on updates to the software, as well as its policies and procedures.

The impact of using a lower similarity threshold is therefore another question the situation raises. Notably, the UK Parliament post cites an error rate increase of over 9 percent for LFR “outside a testing environment,” but does so with reference to NIST FRVT results from 2020 comparing match rates from mugshots to those of photos captured “in the wild.” NIST biometrics evaluations are carried out in a testing environment, and the reference to FRVT is so outdated that the program now goes by a different name.

Specific quantities and vague outcomes

The barriers to understanding what the numbers mean don’t just apply to figures supplied by UK police.

Big Brother Watch tells The Guardian it has now heard from 21 people who were wrongly placed on watchlists for retail facial recognition systems or matched with someone else on them.

A figure was not provided for how many misidentifications resulted in police or retail staff detaining the individual, and outcomes can vary quite widely.

For example, The Guardian reported separately on Monday that during a recent pilot in Croydon, live facial recognition scanned thousands of faces in public, and sent 19 alerts to police. Nine of the alerts led to arrests, and two other people were stopped and questioned by police.

The weekend Guardian article also quotes a former security guard who worked at a shop that used Facewatch cameras. He says that not only can retail staff “maliciously” add individuals to the watchlist of shoplifters and criminals, but he knows of 10 to 15 examples of it happening.

Sampson, who is now non-executive director at Facewatch, tells Biometric Update “the Facewatch system is designed in such a way as to prevent misuse and has very strict rules governing its use with inbuilt safeguards and controls. Facewatch sends more than 500,000 alerts to retailers each year and decisions to monitor or remove anyone as a result of an alert are made by staff using their own policies and procedures.”
