UK police plan to introduce controversial biometric technology despite growing chorus of concern
An independent review of police use of face biometrics in the UK, commissioned by the Ada Lovelace Institute, will draw heavily on recent court findings on the use of live facial recognition by South Wales Police. Elsewhere in the UK, however, government funds have been allocated to a project to deploy computer vision biometrics for controversial affect recognition capabilities and for searches by facial characteristics.
Earlier in August, the Court of Appeal ruled in R (Bridges) v Chief Constable of South Wales Police that the use of biometric Live Automatic Facial Recognition (‘LAFR’) technology on crowds by South Wales Police was unlawful and requires a proper legal framework.
“SWP deployed AFR Locate on about 50 occasions between May 2017 and April 2019 at a variety of public events,” reads the press summary. “These deployments were overt, rather than secret… AFR Locate is capable of scanning 50 faces per second. Over the 50 deployments undertaken in 2017 and 2018, it is estimated that around 500,000 faces may have been scanned. The overwhelming majority of faces scanned will be of persons not on a watchlist, and therefore will be automatically deleted.”
In response to the decision, “the status quo cannot continue,” argue Matthew Ryder QC and Jessica Jones of Matrix Chambers in a recent article. Back in January, the Ada Lovelace Institute chose Ryder to lead an independent review of biometric data governance and data misuse, also known as the “Ryder Review.”
“For every lawyer watching the case, the exceptional importance of the appeal was obvious even before the hearing,” write Ryder and Jones. “Not only were Mr. Bridges and the South Wales Police represented in the appeal, but also the Home Secretary, the Information Commissioner, the Surveillance Camera Commissioner, and the Police and Crime Commissioner for South Wales.”
The Court of Appeal found South Wales Police’s use of facial recognition unlawful on three grounds. First, the deployment breached Article 8 of the European Convention on Human Rights, which guarantees the right to privacy, because it was not “in accordance with law” and the legal framework displayed “fundamental deficiencies.” Second, the use of LAFR technology breached the Data Protection Act 2018 (DPA), because the force failed to assess the overall risks to individuals’ rights and freedoms. Third, the use breached the public sector equality duty (PSED), whose goal is “to ensure that a public authority does not inadvertently overlook information which it should take into account.”
Key findings in the appeal include the court’s acknowledgement that LAFR is a new technology requiring its own, more adequate legal framework, because it is not analogous to taking photographs or to conventional CCTV use.
According to Ryder, a suitable legal framework and adequate impact assessments on individual rights are mandatory before LAFR can be used.
A police force in Lincolnshire, UK, however, wants to introduce an AI system that leverages facial recognition and behavioral analysis to detect emotion, The Times reports.
Lincolnshire’s police and crime commissioner, Marc Jones, has received funding from the Home Office to deploy such a system in Gainsborough, which would allow officers to search footage for vehicles, for moods and expressions, and for people wearing glasses, hats, and other accessories. The system is not yet operational, as it awaits the completion of an impact assessment on human rights and privacy.
West Midlands and Kent police are working with the Home Office on a retrospective facial recognition study.
In December 2019, the AI Now Institute at New York University argued there was no scientific reasoning for the use of biometric “affect recognition.” Citing multiple flaws in the methodology for interpreting moods from facial expressions, the institute said the technology should be banned. At the time, it was estimated that the affect recognition market would reach $90 billion by 2024.
Face biometrics are estimated to surpass $15 billion by 2027 as verticals and applications expand.
Shortly before the Ryder Review was announced in January 2020, the London Metropolitan Police announced similar biometric facial recognition trials in the city. Following complaints over privacy rights, the force later said it was considering pausing the technology’s expansion.