Evaluations of Corsight live facial recognition follow Essex Police pause

ICO, National Physical Laboratory, Cambridge reports address accuracy and bias concerns

Essex Police paused its live facial recognition (LFR) deployments after identifying potential accuracy and bias risks, an audit published this week by the UK Information Commissioner’s Office has revealed. The audit is one of several new reports, including an assessment of Corsight’s LFR by the National Physical Laboratory and an evaluation of Essex Police’s use of the technology from Cambridge University, that could pave the way for the program’s relaunch.

Last year, the police force published an impact assessment report saying that its Corsight AI and Digital Barriers LFR system produced just one incorrect alert, resulting from a false match, across more than 383,000 match attempts. The report, however, was criticized by digital rights group Big Brother Watch and others, who claimed that the biometric assessments featured inconsistencies and poor methodology.

The campaign group pointed out that Essex Police said it would configure LFR’s threshold “at 0.6 or above” to achieve equitable false positive rates across demographics. However, this figure comes from the National Physical Laboratory’s testing of NEC’s Neoface V4 algorithm — used by the Metropolitan Police and South Wales Police — not the system Essex Police actually deploys.

A web page Essex Police maintains on its live facial recognition policy includes an accuracy and equitability evaluation of Corsight’s Apollo 4 software, published this month by the UK’s National Physical Laboratory (NPL). The evaluation, carried out under the ISO/IEC 19795 standard with the same methodology and dataset as the NPL’s earlier test of NEC’s facial recognition, shows a True Positive Identification Rate (TPIR) of 89 percent and a False Positive Identification Rate (FPIR) of 0.017 percent, or 1 in 5,700 matches, with a watchlist of 18,000 images. With a reference biometric database of 1,800 images, TPIR remained the same, while FPIR improved to 0.002 percent (1 in 57,000).

Differences across gender, ethnicity and combined demographic factors were not found to be statistically significant at the 0.05 significance level. At a face match threshold of 55, TPIRs for different demographic groups ranged from a high of 94 percent for Black males to 86 percent for white males.
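The roughly tenfold FPIR improvement at the smaller watchlist is what a simple probability model would predict, since each probe is compared against ten times fewer references. The sketch below is our own illustration under an assumed constant per-comparison false match rate (FMR), not part of the NPL report:

```python
# Illustrative sketch only (our assumption, not the NPL's methodology):
# with a constant per-comparison false match rate (FMR), the chance of at
# least one false positive per probe grows with watchlist size N:
#   FPIR ~= 1 - (1 - FMR)^N

def fpir(fmr: float, watchlist_size: int) -> float:
    """Probability that a probe falsely matches at least one watchlist entry."""
    return 1 - (1 - fmr) ** watchlist_size

def implied_fmr(fpir_value: float, watchlist_size: int) -> float:
    """Back out the per-comparison FMR consistent with a reported FPIR."""
    return 1 - (1 - fpir_value) ** (1 / watchlist_size)

# Reported: FPIR of 0.017 percent against an 18,000-image watchlist.
fmr = implied_fmr(0.00017, 18_000)

# Shrinking the watchlist tenfold to 1,800 images cuts the predicted FPIR
# roughly tenfold, to about 0.0017 percent -- the same order of magnitude
# as the 0.002 percent the NPL reports for the smaller database.
print(f"{fpir(fmr, 1_800):.4%}")
```

Under this model the two reported FPIR figures are mutually consistent, with the small gap attributable to rounding in the published percentages.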

A list of live facial recognition deployments provided by Essex Police shows a 77th and final entry on August 26, meaning deployments were already paused when the ICO carried out its audit last November.

The pause in LFR deployment is not directly related to criticism of the impact assessment report, but rather to its early findings, according to Essex Police.

In its audit, the ICO says that the accuracy and bias risks have been escalated and monitored through the Facial Recognition Working Group and Strategic Board.

The audit focuses on the data protection aspect of LFR use, and concludes that Essex Police provides a reasonable level of assurance related to data protection compliance in regard to LFR and a high level of assurance in regard to retrospective facial recognition (RFR).

In the meantime, the police force published a report last week combining findings from three studies conducted by the University of Cambridge. The research, carried out between January and June 2025, examines the accuracy of the technology used by Essex Police, who appears on its watchlists, and whether LFR deployments affect crime levels.

The report shows that Essex Police ran an experiment using paid actors to test how the system’s True Positive Identification Rate (TPIR) corresponds to different identification thresholds.

The findings show that at the current operational threshold of 55, the LFR system correctly identified 50.7 percent of people on the watchlist, while incorrect identifications were extremely rare. The system was also more likely to correctly identify men and Black participants than women and participants from other ethnic groups.

“In line with our commitment to our Public Sector Equality Duty, Essex Police commissioned two independent studies which were completed by academia,” an Essex Police representative told Biometric Update in an emailed statement.

“The first of these indicated there was a potential bias in the positive identification rate, while the second suggested there was no statistically relevant bias in the results.

“Based on the fact there was potential bias the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software.

“We then sought further academic assessment.

“As a result of this work we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals.

“We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”

ICO audits of police facial recognition continue

Aside from auditing facial recognition use by the Essex Police, the ICO has also published a similar evaluation for Leicestershire Police.

The audits are a regular part of the ICO’s AI and biometrics strategy and cover police forces across England and Wales. The agency has previously published results from South Wales and Gwent police.

For Leicestershire Police, the ICO reviewed its use of retrospective facial recognition (RFR) and concluded that there is a reasonable level of assurance regarding data protection compliance. The police force began trialing the technology in 2015, but despite over 10 years of use, it has not deployed live facial recognition (LFR).

The force received recommendations to ensure appropriate technical training for staff in data protection and privacy roles and to document the use of the Police National Database (PND) for retrospective facial recognition within the FRT policy.

“What’s clear from this work so far is that robust data protection must sit at the heart of all police use of FRT,” says Emily Keaney, deputy commissioner for Regulatory Policy. “Forces may use the technology in different ways, but all must fully understand the systems they rely on and anchor their approach in strong governance.”

The ICO also recommends that all police forces conduct routine testing for bias and discriminatory outcomes, whether arising from technology design, training data, or watchlist composition.

In December, the Home Office identified historic bias within the algorithm used for RFR searches on the Police National Database. The ICO is examining the issue, according to Keaney.

The UK has been expanding deployments of live facial recognition across the country. After London’s Metropolitan Police, South Wales Police and Essex Police, the technology is being implemented in Greater Manchester, West Yorkshire, Bedfordshire, Surrey, Sussex, Thames Valley and Hampshire. The Home Office plans to finance 10 new LFR vans over the next five years.

This post was updated at 1:00pm Eastern on March 20, 2026 to include comment from Essex Police.
