
Policing and facial recognition: What’s stopping them?


By Professor Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner

“If it’s legal and it works, what’s stopping them?” Police use of Facial Recognition Technology (FRT) generates a great deal of commentary, and this question comes up repeatedly. As the UK’s Biometrics and Surveillance Camera Commissioner I’d be asked a variation of this FAQ in drive-time phone-ins, police force inspections, online conference forums and weekly interviews with journalists; it hasn’t gone away. It deserves an answer for its sheer persistence alone but, before that, the question itself needs some work.

The question contains two “ifs” and a presumption; all are carrying a lot of weight. The first “if” is the legal basis for using FRT. Do the police have the power to use it? In England and Wales the police certainly have statutory powers to take and retain images of people, along with common law powers to obtain and store information about the citizen’s behavior in public. The government’s own Surveillance Camera Code of Practice (currently on policy’s death row) provides guidance to chief officers on how to do this and on operating overt surveillance systems in public places generally. The Court of Appeal found a “sufficient legal framework” covered police use of FRT, one that was capable of supporting its lawful deployment. However, legal questions are rarely binary and there are many niceties to cover off before the police can say their use of technology is fully in accordance with the law. General data protection laws add important layers to intrusive police activity, as do obligations around covert surveillance, equality and interference with fundamental rights such as freedom of speech and assembly. All of these attract guidance from, and vigilance by, regulatory bodies, and all must be taken into account if the police are to claim that their use of the technology is, in every case, “lawful.” This area needs great care; even ministers have been confused about the law, as Matthew Ryder KC evidenced in his independent review for the Ada Lovelace Institute. An express power to do something is not the same as an absence of specific laws preventing it and, in this context, we sometimes have more absence than expression.

The second “if” relates to the technology, i.e. “if FRT works, what’s stopping the police from using it?” Since a shaky introduction around 2015, when it didn’t work as hoped (or required), police facial recognition technology has come on significantly. The accuracy of the technology is much better, but is it accurate to say it now “works”? Each technology partner and purchasing police force must answer that for themselves, as for any other operational capability. That’s accountability. But in AI terms, 10 years ago is the Pleistocene era, and bearding police chiefs with first-generation statistics is jousting with fossils. Accountability for the use of state-of-the-art, AI-driven technology requires up-to-the-minute evidence, something that will be critical for all uses of AI by the police. At the Centre for Excellence in Terrorism, Resilience, Intelligence and Organised Crime Research (CENTRIC) we are working with policing and government bodies to create tools for that very purpose.

The question also assumes that the police are holding back from deploying FRT. Is this a fair challenge? When people ask what’s stopping the police using FRT, they usually mean live (or real-time) facial recognition, but there are other applications of the technology. The take-up of retrospective FRT, in many ways its least contentious application, has been more extensive than live matching, and its use is widespread across policing in England and Wales. Live FRT, however, has had a cautious rollout. Why? Three key perspectives when answering surveillance questions are: technological (what can be done), legal (what must/must not be done) and societal (what people support/expect to be done). Applying these, we can see the question is really asking: “as it’s possible and permissible, what’s stopping the police using live facial recognition?” The answer may well lie in the third perspective. The extent to which police use of live FRT is acceptable, and what people expect the police to do with it, may reveal more about the public’s response to the undisputed potential of the technology. While academic research shows some ambivalence, public attitudes to police use of live FRT have not really been tested in the UK, either directly by the police and their governance bodies or through parliamentary rigor.

A couple of police forces were early adopters of live FRT, shouldering much of the investment risk and most of the justified blowback. The Metropolitan Police Service was one of them, and their Commissioner, Mark Rowley, reportedly believes not only that the public support his force’s use of FRT, but also that its impact could help restore their trust in policing. The experience of his fellow early adopters in the retail sector may corroborate that belief. Trusted brand stores up and down UK high streets using FRT are seeing significant drops in crimes against their staff, customers and stock. Maybe the police should be asking what’s stopping them.

About the author

Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.
