London police face questions after not deploying LFR at far-right rally

London police are facing questions about why they chose not to deploy live facial recognition (LFR) during last week’s far-right march organized by Tommy Robinson, after using the technology at the Notting Hill Carnival just weeks earlier. The questions come as the Metropolitan Police’s LFR system faces a court challenge over accusations of algorithmic bias against minorities.
The Met Police Commissioner said there was no “intelligence basis” to deploy LFR during the Unite the Kingdom (UTK) rally, which drew more than 110,000 anti-immigration protesters to central London last Saturday.
Furthermore, the technology has not been used at any protests so far, Commissioner Sir Mark Rowley said at the London Policing Board on Tuesday.
“That’s one of the safeguards – that you’re not using it as a mass surveillance tool – you’re using it in particular places where there is intelligence-based to say this may actually make people safer,” he said.
The Unite the Kingdom rally resulted in clashes with police that left 26 officers injured. The Met Police said its officers faced “unacceptable violence,” with four sustaining serious injuries. Police made 24 arrests following the rally, with 50 more expected.
Despite this, Rowley said that police did not have grounds to deploy LFR since previous protests held by the organizers of the UTK rally “didn’t have any trouble of any significance,” the Southwark News reports.
The Notting Hill Carnival, by contrast, has seen multiple violent incidents in recent years despite being a cultural event. During this year’s Afro-Caribbean celebration, police made 528 arrests, including 61 that followed identifications by live facial recognition.
“That intelligence case has been built up over multiple years,” the Commissioner said. “There are dangerous people who are going to undermine this event for the good majority, and that needs tackling.”
Police deployments of LFR at the Notting Hill Carnival have been accompanied by concerns about algorithmic bias against the Black community.
The Met Police is currently being sued by Shaun Thompson, an anti-knife crime community worker from London who was detained by officers after a facial recognition system produced a false match. The UK’s Equality and Human Rights Commission (EHRC) recently announced it would provide evidence that the police’s use of LFR breaches the European Convention on Human Rights.
Thompson’s case is being supported by digital rights advocacy group Big Brother Watch. The pair recently secured a High Court decision capping their cost liability at £70,000 (US$95,000).
Ireland continues to debate allowing facial recognition
The UK is not the only country debating law enforcement’s use of facial recognition.
In neighboring Ireland, the national police force, An Garda Síochána, has been advocating for deploying the technology to solve crimes. The Department of Justice, however, is still drafting the Garda Síochána (Recording Devices) (Amendment) Bill, even though a draft was published in 2023.
The bill is set to be published during the coming Dáil term, the Department told the Irish Examiner. Meanwhile, the legislation has been examined by the Oireachtas justice committee, which recommended 32 amendments.
“It is clear […] that ethical and operational concerns remain in relation to the use of FRT,” says a committee briefing paper dated June 2025.
The law would allow Gardaí to use retrospective facial recognition in special circumstances, including reviewing CCTV footage of violent events after the fact. The ability to solve such crimes has been one of the main arguments police have made in the technology’s favor.
During a justice committee meeting in February 2024, former Garda Commissioner Drew Harris said police had been forced to manually sift through 22,000 hours of footage from the anti-immigrant riots that took place in Dublin in November 2023.
An eight-person team was assigned to work around the clock on the case, but the manual processing proved “unfeasible and ineffective,” Harris said. AI tools, including those that record characteristics of individuals such as clothing and accessories, could help police identify suspects more easily.
“What we are primarily talking about is the use of the technology to filter, cluster, or sift evidence, and to boil it down to a series of suggested cases at which the examiner would look,” then Garda Chief Information Officer Andrew O’Sullivan said at the time.
Committee members, meanwhile, have expressed concerns about the processing of innocent people’s images and about misidentification. The Irish Council for Civil Liberties called facial recognition an “intrusive, unreliable, and dangerous” form of surveillance and urged the police and the Department of Justice to clarify which facial image databases would be used and how the police plan to mitigate issues such as accuracy and discrimination.