As London expands CCTV network, critics raise concerns about overreach

London is significantly upgrading the CCTV infrastructure in one of its most surveilled boroughs, installing new cameras with live facial recognition (LFR) and adding AI capabilities to existing ones.
The western borough of Hammersmith and Fulham already has 2,500 cameras, more per person than anywhere else in the UK. The local council has now approved a budget of more than £3.2 million (US$4.3 million) to upgrade the network over the next three years.
“This investment is about giving families peace of mind, helping victims see justice done, and ensuring criminals know there’s nowhere to hide in H&F,” says the borough’s Council leader, Stephen Cowan.
The upgrade builds on the borough’s existing £5.4 million ($7.2 million) investment in CCTV and is set to deliver “the most sophisticated surveillance network anywhere in the country.”
The 20 new live facial recognition cameras will be placed at 10 high-traffic locations and will match faces against police databases in real time. AI-powered crime detection capabilities, including weapon detection and automatic tracking of suspects and vehicles, will be added to 500 existing cameras. Another 50 cameras will be equipped with speakers, enabling police officers to issue warnings to deter anti-social behavior.
More importantly, the upgrade will enable the use of retrospective facial recognition across the CCTV network, allowing authorities to track suspects across the area. The upgrade plan also recommends introducing drones as an “enforcement aid,” according to The Standard.
Facial recognition expansion continues
The announcement comes amid a large push by the Met Police to increase the use of facial recognition for policing, including LFR.
This summer, the Home Office announced a rollout of 10 new vans equipped with LFR technology from NEC to seven police forces. One of these forces is Bedfordshire Police, which deployed the Home Office-funded technology in Bedford Town Centre last week; the next deployment is scheduled for Luton Town Centre on September 26.
The UK’s first permanent live facial recognition system was set up this summer in Croydon, with analysis of the deployment expected to be released during the autumn. The neighborhood has become one of the most targeted areas for facial recognition deployments in the city.
UK police chiefs have asked the government to commit £220 million ($296 million) annually for the next three years to support technology projects, including live facial recognition rollouts. The use of surveillance tech, however, has drawn concern from biometric technology experts, privacy advocates and the public.
CCTV is far more intrusive than LFR
While everyone wants to talk about live facial recognition, “CCTV is far, far more intrusive,” according to Fraser Sampson, former UK Biometrics and Surveillance Commissioner.
“Live facial recognition, if it isn’t looking for you, it doesn’t even know you,” Sampson said in a Biometric Update podcast last week.
Cameras, on the other hand, capture people’s photos and sometimes recordings, and those records are kept for days, months or even years and can be accessed by government agencies and law enforcement, he notes.
Police and city authorities have argued that expanding facial recognition use is necessary to prevent crime. Hammersmith and Fulham Council leader Cowan, for instance, has noted that the cameras will help provide definitive evidence to courts, making them more efficient at a time when the criminal justice system is straining under the consequences of austerity measures.
Last year, the borough’s cameras helped the London Metropolitan Police arrest 754 people. Between January and August this year, the police force has already made 634 arrests. The police rely on a 24-hour control room in which operators monitor live feeds from across the network.
Other critics, including digital rights organization Big Brother Watch, fear that the technology is being deployed without adequate limits or safeguards. Unlike the EU with its AI Act, the UK does not govern facial recognition through a unified law, but through a patchwork of different regulations.
But facial recognition itself may not be the problem, according to Sampson.
UK government expands use of technology despite concerns
Facial recognition has been a highly visible part of the UK government’s expanding use of technology. Earlier this month, Met Police Commissioner Mark Rowley proposed making it even more ubiquitous by installing facial recognition on police officers’ smartphones.
But the country has also been introducing other technology tools that have invited controversy and distrust, the New York Times reports.
This year, the UK introduced the Online Safety Act, targeting pornographic content and other material that could prove harmful to children. The law, which also introduced age verification for popular sites such as Instagram and Reddit, was met with backlash and accusations of overreach.
The government’s debates on introducing a national digital identity have also sparked fears of control and surveillance, while the introduction of AI tech, such as age assessment, for handling asylum applications has raised concerns among some government workers.
“If you don’t trust your police with new technology, the problem isn’t the technology,” says Sampson.