Will change in the White House sound a different tune for AI-driven biometrics in law enforcement?

By Professor Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner
Flowing directly from the President’s pen, the Executive Order (EO) is a highly individualised form of lawmaking. There was even a Hollywood blockbuster named after this powerful legal device, which has no equivalent in the UK (no one is going to make a film called The Digital Government (Disclosure of Information) (Identity Verification Services) (Amendment) Regulations 2024).
If Day One of the Trump administration lives up to its billing, the next batch of EOs will arrive with a fanfare, but the one the biometric community should keep its ears open for is the somewhat muted EO 14074.
Issued by President Biden in May 2022, Executive Order 14074 aims to advance effective, accountable policing and enhance public trust and safety. A joint federal agency report published last month reviews how biometric technologies are being employed by all law enforcement bodies across the United States under the terms of that EO. The report presents a detailed analysis of stakeholder views on federal biometrics and a set of ‘best practices’ for facial recognition technology (FRT) in policing.
Accompanying both documents is a key refrain: accountability. The UK may not have executive orders, but it very much shares some of the biometric challenges set out in the report and the need for accountability in the field of AI-driven capabilities in policing. Despite constitutional differences, police chiefs in both countries are currently united by two practical questions: “What does accountability for AI biometrics look like?” and “How do we achieve it?”
From the UK side, researchers believe they have answers to those questions, answers that will be of interest to the authors of the federal report. Drawing on the collective experience of police officers and staff, and a citizen survey of over 6,000 people from countries across Europe, North America and beyond, a joint police and academic project (AiPAS) has identified the indicators of accountability in AI-driven technology. The team have also built a software tool to help policing bodies audit their policies and practices against these principles.
Combining the themes from the joint federal report and AiPAS, the accountability basics for AI-driven biometrics in law enforcement look something like this:
Lawfulness (follow the law). All aspects of AI biometrics must be lawful, and the burden of proving they are sits with the police. This includes compliance with AI-specific statutes (such as the EU AI Act), EOs and organisational policies. Lawfulness applies in every situation. The rest are lawfulness plus.
Completeness (leave nothing out). AI-driven biometrics are multi-partner programmes and accountability should cover everything from procurement to deployment, including partners and sub-contractors. Where there are gaps, the protection and promotion of fundamental rights and freedoms should prevail.
Inclusivity (leave no one out). Involve all stakeholders engaged in, and affected by, the AI system. Build in diversity and reduce the potential for bias through broad stakeholder participation in creating policy, reviewing deployment and looking for learning points.
Transparency (be open). Transparency promotes public trust and confidence, enabling those affected to make informed judgments. Accountability needs clear, meaningful, accurate and timely (CMAT) information about AI biometrics systems (subject to operational sensitivities). Show why using the system is necessary and proportionate, highlighting foreseeable risks.
Impartiality (empower independence). Accountability bodies should be impartial and free from conflicts of interest. External accountability – courts and regulators – has this built in. Internally, people responsible for accountability for AI biometrics should be independent of the line management structure of those involved in their design, development and deployment.
Proof (follow the evidence). The police know how to capture, analyse and present evidence. Evidence of accountability should mirror the standards of operational evidence gathering in terms of integrity, credibility and continuity, reflecting the impact of decisions to use – and not to use – available AI biometrics.
Enforceability and Redress (make it right). Without a ‘so what?’, accountability is ornamental. Accountability mechanisms must give people an effective remedy, including routes for complaint and challenge, internal enforceability and redress.
Compellability (make it work). Oversight bodies must have levers to make the accountability work. External compellability comes from legal or democratic processes, while internal policies should authorise action without the need for legal powers.
Explainability (describe, demonstrate, demystify). AI biometrics must be understood by their participants and audience. Explaining a system in a technical setting is one thing; describing it in non-technical language so that citizens and their representatives can understand, participate in and challenge its use is harder but essential.
Constructiveness (aim for better). Accountability is more than criticism. Participating constructively with a shared aim of improvement, considering different perspectives and inviting challenge will reveal how disagreement can lead to beneficial solutions.
Conduct (hold yourself accountable). The European Code of Police Ethics states, “the condition of a democracy can often be determined just by examining the conduct of its police”. Police ‘conduct’ will increasingly include the use of AI technology, both individual and organisational, engaging professional standards, values and expected behaviours which incorporate integrity and ethics.
Learning (look for the lesson). Promote a willingness to improve AI in every respect through the application of new knowledge and insights. Applied to the design, use and oversight of AI biometrics, learning includes modification and improvement of systems, structures, practices, processes, knowledge and resources, as well as the development of professional doctrine, standards and policy.
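By way of illustration only – this is not the AiPAS compliance tool, and the principle names and scoring are a hypothetical sketch – the twelve indicators above can be treated as a simple audit checklist: record whether a policy shows evidence for each principle, then report coverage and gaps.

```python
# Hypothetical sketch of auditing a policy against the twelve
# accountability principles listed above (not the AiPAS tool).
PRINCIPLES = [
    "lawfulness", "completeness", "inclusivity", "transparency",
    "impartiality", "proof", "enforceability_and_redress",
    "compellability", "explainability", "constructiveness",
    "conduct", "learning",
]

def audit(evidence: dict[str, bool]) -> tuple[float, list[str]]:
    """Return the coverage ratio and the principles still lacking evidence."""
    gaps = [p for p in PRINCIPLES if not evidence.get(p, False)]
    coverage = (len(PRINCIPLES) - len(gaps)) / len(PRINCIPLES)
    return coverage, gaps

# Example: a draft FRT deployment policy with two principles undocumented.
draft = {p: True for p in PRINCIPLES}
draft["compellability"] = False
draft["learning"] = False

coverage, gaps = audit(draft)
print(f"coverage: {coverage:.0%}")  # coverage: 83%
print("gaps:", gaps)                # gaps: ['compellability', 'learning']
```

The point of even a toy checklist like this is that accountability becomes auditable: a gap list is something a policing body can act on, review and evidence over time.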
The AiPAS team have offered to share their methodology, findings and compliance tool with joint federal agencies and will also be listening carefully for the mood music from the White House after 20 January.
Whatever becomes of EO 14074, the use of AI-driven biometrics in law enforcement will outlast us all. Capabilities such as FRT will advance police effectiveness, enhancing public trust and safety globally if we work under the banner of accountability to secure the blessings of our liberty. Continuing and collaborating across law enforcement communities would also exemplify another celebrated presidential observance: out of many, one.
About the author
Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.