
Do the police ever forget a face?

By Professor Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner

There is a neurological condition called prosopagnosia, which the National Center for Biotechnology Information defines as “the inability to recognize known and new faces”. Some published studies of the condition – also known as face blindness – describe symptoms as including impairment of ‘facial recognition’, ‘face recognition’, ‘face identification’ and the ‘forgetting/not remembering of faces’.

Is there a difference between face recognition and facial identification, between forgetting and not remembering a face? In medical terms perhaps not but in biometric terms – particularly in law enforcement – there is, and it can be profound.

Many cameras can recognise a human face as distinct from, say, a logo or a book. Next time you use your smartphone to take a picture, look out for the yellow squares (if you’re using an iPhone) that instantly form around any faces in view. This function is also known as face detection; in many devices you can switch it off. This basic ability to distinguish a face could be described as face recognition. It will be of limited use in a law enforcement context, beyond perhaps searching or scanning for signs of human presence.

But being able to identify what appears to be a human face is not the same as being able to identify the person whose face it might be. Neurologically, face identification may be similar to detection; biometrically it’s very different. Before it can ‘recognise’ who you are, a camera needs to have something precise to compare your face with, a process better described as face matching. Face matching is what’s being done when your device wants to confirm your identity for access to your personal account with an online service provider or to unlock your phone. In that context it’s a 1:1 process where the device matches your image with the one planted in its ‘memory’. The police use this, for example, to confirm that the driver in the vehicle they have stopped is in fact the person they claim to be. The image being matched against the face is stored somewhere in the police memory bank, perhaps on a database of people who are wanted or convicted.
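For readers who want to see the mechanics, here is a minimal sketch of what a 1:1 match amounts to, assuming the camera pipeline has already reduced each face to a fixed-length numerical ‘embedding’. The function names, the 128-dimensional vectors and the 0.6 threshold are illustrative assumptions, not a description of any particular device or police system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings: close to 1.0 means "very likely the same face"
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_1_to_1(enrolled_template: np.ndarray, live_capture: np.ndarray,
                  threshold: float = 0.6) -> bool:
    # 1:1 verification: is the live face the same person as the single enrolled template?
    # The 0.6 threshold is an assumption for the example; real systems tune it to balance
    # false matches against false non-matches.
    return cosine_similarity(enrolled_template, live_capture) >= threshold

# Illustrative use with made-up 128-dimensional embeddings
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                          # template stored at enrolment
same_face = enrolled + rng.normal(scale=0.1, size=128)   # a fresh capture of the same face
stranger = rng.normal(size=128)                          # a different person entirely
print(verify_1_to_1(enrolled, same_face))                # expected: True
print(verify_1_to_1(enrolled, stranger))                 # expected: False
```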

Face matching can also be undertaken on a 1-to-many basis, for example when the police compare a picture with images of victims and offenders from seized recordings in child sex abuse investigations. Sometimes there may still be no information as to the identity of the person, only a confidence assessment that the person in video X is the same person as the one caught on dashcam Y or security camera Z. With this form of face matching there’s no risk of ‘forgetting’ the face: if it’s on the database, it’s remembered.
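Again purely as an illustration, that retrospective 1-to-many search can be sketched as ranking a probe face against every retained image. The gallery labels, embedding sizes and scores below are invented for the example and are not drawn from any real investigation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidates(probe: np.ndarray, gallery: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    # 1:many search: score the probe face against every retained image and return the
    # candidates ranked by confidence. The labels are evidence references, not necessarily
    # known identities -- the output is "these look like the same person", nothing more.
    scores = [(label, cosine_similarity(probe, emb)) for label, emb in gallery.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# Hypothetical gallery of retained, possibly unidentified, faces
rng = np.random.default_rng(1)
gallery = {
    "dashcam_Y_frame_0412": rng.normal(size=128),
    "security_camera_Z_clip_7": rng.normal(size=128),
    "seized_recording_face_3": rng.normal(size=128),
}
probe = gallery["dashcam_Y_frame_0412"] + rng.normal(scale=0.1, size=128)  # face from video X
for label, score in rank_candidates(probe, gallery):
    print(f"{label}: {score:.2f}")   # the dashcam frame should score highest
```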

Beyond basic matching is the process whereby a camera detects the faces of people passing by and compares them instantaneously to a lineup with which it has been programmed. This is what people often have in mind when talking about Facial Recognition Technology (FRT). In this instance of ‘live’ FRT the camera is actively matching passers-by with faces that have been shown to it, a bit like a police dog looking for someone giving off the same scent as the item of clothing just waved under its nose. If there is no match between a passing face and the programmed gallery, the image is not retained and the system has no further information about it. In other words, the camera has recognised your face as a face but can’t identify you – now or in the future – because it never ‘knew’ who you were. It cannot ‘forget’ what you look like for the same reason. This form of digital prosopagnosia is designed into the live FRT cameras currently used in UK policing, and it’s arguably far less intrusive than a constable snapping pictures of people in the street and keeping the prints.
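A rough sketch of that live matching loop, under the same assumptions as before (pre-computed embeddings, an invented 0.6 threshold and hypothetical watchlist names), shows where the digital prosopagnosia lives: faces that don’t clear the threshold are simply never kept.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face embeddings
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def process_frame(detected_faces: list[np.ndarray],
                  watchlist: dict[str, np.ndarray],
                  threshold: float = 0.6) -> list[str]:
    # For each face detected in the frame, find the best-scoring watchlist entry.
    # Faces that clear the threshold are flagged for a human operator to review;
    # everything else goes out of scope -- nothing about unmatched faces is stored.
    alerts = []
    for face in detected_faces:
        best_name, best_score = max(
            ((name, cosine_similarity(face, template)) for name, template in watchlist.items()),
            key=lambda pair: pair[1],
        )
        if best_score >= threshold:
            alerts.append(best_name)
        # no else branch: the unmatched embedding is simply discarded with the frame
    return alerts

# Hypothetical watchlist and a handful of passers-by
rng = np.random.default_rng(2)
watchlist = {"wanted_person_A": rng.normal(size=128)}
passers_by = [rng.normal(size=128) for _ in range(3)]                          # strangers
passers_by.append(watchlist["wanted_person_A"] + rng.normal(scale=0.1, size=128))
print(process_frame(passers_by, watchlist))   # expected: ['wanted_person_A']
```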

Can your face get into the memory of a police FRT system any other way? Much of the controversy about live FRT comes from people finding themselves on a ‘watchlist’ without having a police record. There are people in England and Wales – several million by recent estimates – who were once arrested but never proceeded against, and some of their faces have found their way onto live FRT watchlists. Does this mean the police can never forget a face and are allowed to match against anyone who has ever passed through their custody? Not according to the High Court, but many (most?) of those faces and the accompanying personal identification data are still in the police memory bank. Anyone who wants their face to be ‘forgotten’ by the police must apply for the image to be deleted.

The state has other ways of recognising your face if all the relevant synapses are working together. A current example is the UK database of driving licence photographs, which is causing some concern by being linked to new powers in the Crime and Policing Bill. Passport photographs can also be searched against. And of course, the largest source of facial images is ourselves. Uploaded selfies and social media posts contain billions of faces, sharing our memorable (AKA matchable) moments in a way that has yet to sink in. Should the state be able to match wanted people against those? The EU AI Act bans the untargeted scraping of facial images from the internet, but why shouldn’t the police cast their investigative eye across a virtual crowd to see if their algorithm recognises anyone?

As AI-enabled (AI-driven, AI-equipped, AI-enhanced?) face matching capabilities evolve, these distinctions and questions will become more important. Algorithms of the future won’t just match against images but against inferences. Using genealogical information, they will predict what you will look like before there’s a you to recognise, and zero-shot learning means they can ‘recognise’ faces they’ve never seen before. What will we call that?

It’s not neuroscience but, when it comes to facial cognition, we need some new labels.

About the author

Fraser Sampson, former UK Biometrics & Surveillance Camera Commissioner, is Professor of Governance and National Security at CENTRIC (Centre for Excellence in Terrorism, Resilience, Intelligence & Organised Crime Research) and a non-executive director at Facewatch.
