Hammersmith and Fulham builds on largest CCTV network per person in UK
An update filed with the Social Inclusion and Community Safety PAC by the West London borough of Hammersmith and Fulham details the “work and progress of the borough’s £5.4 million capital investment programme for CCTV,” building on what is already the most extensive CCTV network per person in the UK.
“We are proud to have the highest number of cameras per head of population in the country and, with our upgrade programme now at its midpoint, our previously advertised camera number of circa 1,800 cameras across the borough has increased further to over 2,000 cameras in the public realm and across our housing estates,” says the update.
Listed uses for the surveillance camera network include deterring antisocial behavior, preventing street crime and illegal dumping, and observing “unlicensed activity” in entertainment premises, among others. The report says operators captured 4,896 incidents and worked directly with the Met to secure the arrests of 535 people.
“The ambitions of the service are to constantly evolve and become better,” says the borough. “With the upgrade programme at its midpoint, we are entering into an exciting new era. We are keen to add further functionality and offerings for traded services of the CCTV whether via business, regeneration, commercial or local authority contracts.”
The £5.4 million investment for CCTV, of which the borough has to date spent £1.9 million, runs through 2025/26. It is “designed to improve and grow our CCTV offer alongside improving the service’s resilience and enhancing the use of new and emerging technologies to place Hammersmith and Fulham at the forefront of innovation and service delivery.”
Director of Public Protection Neil Thurlow, who is responsible for the report, has clearly specified that this does not include facial recognition. It will, however, integrate AI. An article in MyLondon quotes Thurlow as saying AI will be tested against the “highest ethics” and that “it’s not going to be used for spyware or anything like that at all.”
Rights coalition calls for outright ban on predictive policing
While the update says Hammersmith and Fulham’s service “has much to be proud of,” some disagree that an increase in surveillance is something to celebrate. Other boroughs have also rejected the use of live facial recognition cameras for biometric surveillance. The council for Islington has called on the Mayor of London, Sadiq Khan, to impose a facial recognition ban on government agencies operating within the borough’s borders.
Deployments by law enforcement are among the most controversial. In Bedfordshire, police use of facial recognition to surveil the Bedford River Festival raised hackles among advocacy groups. A BBC report says the face biometrics system, matched against a watch list, was meant “to locate the county’s most wanted offenders and keep the public safe.”
Campaign group Liberty calls the use of facial recognition in Bedfordshire a “grave concern.” It is among signatories to a letter from the Open Rights Group to the Secretary of State for the Home Department Yvette Cooper, calling for a total ban on the use of algorithmic predictive policing systems to profile or assess the likelihood of criminal behavior in certain people or locations.
“Many AI systems have been proven to magnify discrimination and inequality,” says the letter. “In particular, so-called ‘predictive policing’ and biometric surveillance systems are disproportionately used to target marginalized groups including racialised, working class and migrant communities. These systems criminalize people and infringe human rights, including the fundamental right to be presumed innocent.”
The coalition calls for an outright ban on AI-based predictive policing systems. It also demands increased regulations, safeguards, accountability and transparency; independent oversight; “meaningful human involvement in decisions” and a “clear route to challenge and robust redress.”
“These systems criminalize people and engage and infringe human rights, including the right to a fair trial and the presumption of innocence, the right to private and family life, and data protection rights,” it says of predictive policing. “These systems must therefore be prohibited.”
In a response to the letter quoted in the Daily Mail, a spokesperson for the Scottish government says “the decision to use any technology with facial recognition capability is an operational matter for Police Scotland, which abides by all relevant laws, including the Scottish Biometrics Commissioner’s statutory code of practice.”