London testing facial recognition app for police as another false match surfaces

London police will soon start testing “operator-initiated facial recognition” (OIFR), a mobile app that allows officers to photograph a person and check their identity. At the same time, more cases of misidentification by police facial recognition systems are emerging in the UK.
On Thursday, London Mayor Sadiq Khan shared that the OIFR system, powered by NEC NeoFace software, will be piloted by 100 Metropolitan Police officers for six months. The mobile biometric matching technology will be used during police stops.
One of the advantages of the technology is that police will not have to arrest a person and take them to the station for identification, says Mayor Khan, according to the Guardian.
OIFR has been in use in South Wales since 2024, with police officers relying on NEC's NeoFace algorithms to identify people. NEC's facial recognition algorithms have consistently placed among the most accurate in NIST's Face Recognition Technology Evaluation (FRTE) for identification (1:N).
The London deployment was first announced in December by the Met Police’s Lindsey Chiswick, who explained how the technology would be used.
“If an individual has their photo taken and there is no match, then their biometric information will be deleted straight away,” said Chiswick, who is also the National Police Chiefs’ Council’s (NPCC) lead for Facial Recognition.
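The workflow Chiswick describes follows a simple pattern: photograph the person, search the image against a watchlist, and discard the probe biometric immediately if no match is found. A minimal sketch of that flow, with entirely hypothetical names (`oifr_check`, `similarity`, the threshold value) that do not reflect NEC's actual API:

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional


@dataclass
class MatchResult:
    matched: bool
    identity: Optional[str]


def oifr_check(probe_photo: bytes,
               watchlist: Dict[str, bytes],
               similarity: Callable[[bytes, bytes], float],
               threshold: float = 0.6) -> MatchResult:
    """Hypothetical sketch of a stop-and-check: 1:N search, delete on no match."""
    best_id: Optional[str] = None
    best_score = 0.0
    # Compare the probe photo against every enrolled face (a 1:N search).
    for identity, enrolled in watchlist.items():
        score = similarity(probe_photo, enrolled)
        if score > best_score:
            best_id, best_score = identity, score
    if best_score >= threshold:
        return MatchResult(True, best_id)
    # No match: per the stated policy, the probe biometric is discarded
    # straight away rather than retained.
    del probe_photo
    return MatchResult(False, None)
```

The key design point in the described policy is that retention is conditional on a match; the sketch mirrors that by only returning an identity when the best score clears the threshold.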
The arrival of OIFR in London, however, has sparked rebukes from rights group Big Brother Watch and lawmakers such as Green Party London Assembly member Zoë Garbett, who has previously called for bans on police use of facial recognition in the city.
Both argue that the use of the technology, which is also used by ICE’s Mobile Fortify app in its immigration crackdown in the U.S., introduces bias against minorities.
A recent false identification incident may support their arguments.
South Asian man sues police after false positive with outdated software
In January, Thames Valley Police arrested a man for burglary after a facial recognition system confused him with another person. Alvi Choudhury, a 26-year-old software engineer, was taken from his home in Southampton after the system matched him to CCTV footage of a suspect in a burglary in a town 100 miles away.
Both the suspect and Choudhury are of South Asian descent, but the burglar in the CCTV footage was younger.
“I was very angry, because the kid looked about 10 years younger than me,” says Choudhury, who also says that the suspect’s facial features were different from his. “I just assumed that the investigative officer saw that I was a brown person with curly hair and decided to arrest me.”
The software engineer told the media that the police have admitted that the arrest “may have been the result of bias within facial recognition technology.”
In December, a government assessment of Cognitec’s software revealed that the police have been using outdated algorithms from 2020, which have been shown to produce a higher rate of false positives for black and Asian faces than white faces at certain settings.
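A disparity in false-positive rates compounds quickly at scale: each search of an innocent person against a large watchlist multiplies the per-comparison error rate by the number of enrolled faces. The following arithmetic is illustrative only; the rates and watchlist size are hypothetical and not taken from the government assessment:

```python
def expected_false_positives(watchlist_size: int, fpr: float) -> float:
    """Expected number of wrong candidates returned when one innocent
    probe is searched against a watchlist, given a per-comparison
    false-positive rate (fpr)."""
    return watchlist_size * fpr


# Hypothetical example: if one demographic group experiences a
# per-comparison false-positive rate of 1e-4 and another 3e-4 at the
# same threshold, a search over a 10,000-face watchlist yields roughly
# one versus three expected false matches per innocent probe.
low = expected_false_positives(10_000, 1e-4)   # ~1 expected false match
high = expected_false_positives(10_000, 3e-4)  # ~3 expected false matches
```

This is why a rate difference that looks negligible in benchmark tables can translate into a visible disparity in who gets wrongly flagged once the system is used routinely.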
The police force also told Choudhury that facial recognition is already "subject to review at a strategic level," and that the issue will be raised "as part of wider organisational learning." The UK Home Office has been deliberating on the technology as part of consultations on a legal framework, which kicked off in December.
Choudhury has decided to seek damages from Thames Valley Police and the Hampshire Constabulary for arresting him and holding him for 10 hours.
The police force denies that the arrest was unlawful, saying the facial recognition system provided intelligence but did not determine the arrest.
“While we apologise for the distress caused to the complainant in this case, their arrest was based on the investigating officers’ own visual assessment that the individual matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling,” a Thames Valley police spokesperson says.
Choudhury’s case has also attracted the attention of the Equality and Human Rights Commission (EHRC), with its Chair, Mary-Ann Stephenson, warning of racial disparities in false-positive identification rates and calling for clear rules governing police use of facial recognition.
“We need clear rules to guarantee that facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards,” says Stephenson. “To ensure the new framework is being followed correctly, a new independent body should be established with appropriate enforcement and oversight powers to ensure compliance.”
The EHRC has also agreed to provide submissions in another case of false matching, this time involving the London police.
In 2024, Big Brother Watch mounted a legal challenge to the Met Police’s use of facial recognition after its system produced a false match, which led to the detention of a black anti-knife crime campaigner, Shaun Thompson. The case, brought by Thompson and Big Brother Watch director Silkie Carlo, was heard by the London High Court in January.