Ada Lovelace Institute questions legality of facial recognition in UK

Deployments of biometric technologies, such as facial recognition and emotion recognition, may not be lawful in the UK, according to a new study from the Ada Lovelace Institute. The independent research organization also says that the UK is in “urgent” need of an alternative approach to governing biometrics and proposes a risk-based rulebook similar to the EU’s AI Act.
The UK currently lacks a unified regulation governing biometric systems, instead relying on a patchwork of different legislation. This makes it difficult to understand whether their deployments are happening under a “sufficient legal framework,” the London-based organization argues.
The report specifically mentions live facial recognition (LFR), which has been used by law enforcement agencies to identify suspects in public spaces. Other applications are even less regulated, including private-sector surveillance, such as in retail settings, and police use of retrospective facial recognition, the report notes.
“Oversight and guidance for all other biometric technologies and deployment types […] is significantly less mature than existing guidance for LFR, making such deployments even less likely to be lawful,” says the paper, published on Thursday.
To that end, the Institute outlines recommendations for improving the biometrics regulatory framework, including the establishment of a single regulatory body.
LFR on shaky legal grounds
Biometric systems have been popping up across the country, including in supermarkets, schools, music venues and sporting events. Beyond facial recognition, Brits are now encountering a new generation of invasive inferential biometric technologies such as emotion recognition. The UK's rail infrastructure operator Network Rail, for instance, has used Amazon's emotion recognition technology at train stations despite dubious scientific evidence of its efficacy.
Live facial recognition has been one of the most controversial applications. According to a legal analysis by the Ada Lovelace Institute, guidance and standards issued by police bodies do not fully cover the mandatory practices for LFR set out by the courts.
In 2020, the UK Court of Appeal (Civil Division) ruled that the way South Wales Police used real-time facial recognition was unlawful under the European Convention on Human Rights (ECHR) provisions on privacy and freedom of assembly. The case was brought by Edward Bridges, who was recorded by facial recognition cameras in Cardiff in 2017 and 2018, including while attending a public protest.
The court also set a list of mandatory standards and practices for police use of LFR. However, when the Ada Lovelace Institute compared the 2020 Bridges ruling against the College of Policing’s Live Facial Recognition Guidance and the Metropolitan Police Service Overt LFR Policy Document, it found that the police guidelines are lacking.
“These pieces of guidance address some but not all of the mandatory practices outlined in the Bridges ruling, making the lawfulness of current LFR use unclear – particularly newer deployment types such as permanent LFR camera installations,” says the report.
From ‘diffused’ to ‘centralized’ biometric governance
The Ada Lovelace Institute argues that the UK's current “diffused” biometric governance model, which combines different regulations and standards, is no longer enough. Instead, the country must work on a comprehensive legal framework that covers police use, private-sector surveillance and inferential biometric systems such as emotion recognition.
Similar to the EU AI Act, the country's future biometric rulebook should be “risk-based,” categorizing biometric systems into different tiers according to the risk they pose to fundamental rights, says the organization.
Policymakers should also specify safeguards, such as transparency and notification requirements for vendors and deployers, obligatory technical standards addressing efficacy and discrimination, and system testing and monitoring.
“Different measures would only apply to biometric applications posing a relevant level of risk, ensuring proportionality of compliance requirements,” says the paper. “Risk categorization and safeguards may need to be adaptable depending on the deployer (e.g. public vs private sector) as occurs with other frameworks such as GDPR.”
Among other improvements is the adoption of a definition of biometrics that would both ensure consistency with existing legal frameworks and allow new categories of biometric data and system definitions to be added or removed over time.
Another recommendation from the Ada Lovelace Institute is establishing an independent regulatory body for biometrics. The country has been waiting to fill the post of Biometrics and Surveillance Camera Commissioner amid a regulatory overhaul.
The regulator would have the power to develop codes of practice specific to use cases. The agency would also be in charge of outlining deployment criteria for police use of facial recognition, including live, operator-initiated and retrospective use.
The research organization is not the only one proposing an overhaul of biometrics regulation. In a paper published in February, the Biometrics Institute called for a clearer and more consistent framework for governing facial recognition in public spaces. Despite these appeals, UK police and the Home Office have continued to resist calls for transparency.