UK House of Lords considers issues around biometrics, facial recognition
As the House of Lords Justice and Home Affairs Committee hears from experts that technologies such as facial recognition are “based on sketchy foundations,” it is calling for more contributions to its inquiry into the use of new technologies, such as predictive algorithms and biometrics, in law enforcement.
The Committee has already begun hearing from biometrics experts both publicly and behind closed doors, while a separate bill that would outlaw the use of facial recognition in overt surveillance awaits its second reading.
The Justice and Home Affairs Committee is hoping to get a better understanding of the current deployment of advanced algorithmic tools to discover, deter, rehabilitate or punish people who breach the law in England and Wales. With an understanding of the current legal framework, it will go on to consider the development of such tools and the associated ethics.
In its call for evidence, the committee is seeking answers to questions including:
“Do advanced algorithms used in law enforcement contexts produce reliable outputs, and consistently so? How far do those who interact with these technologies (such as police officers, members of the judiciary, lawyers, and members of the public) understand how they work and how they should be used?”
The Committee has held three sessions so far in the ‘New technologies and the application of the law’ inquiry. Publicly available transcripts show committee members seeking to understand the concepts and how to find out more about the topics. Many of the answers from experts show just how poorly understood the issues are.
“Who can answer that question? I certainly cannot answer that question,” replies Professor Carole McCartney, professor of law and criminal justice at Northumbria Law School and researcher on biometrics and other technologies in the criminal justice system, to a question on bias and design of data sets in a 22 June hearing.
“What we do know, of course, is that one of the big criticisms of technologies, which we have been using for many years now—a lot of forensic technologies, a lot of biometric technologies—is their lack of scientific basis… Very often a lot of these technologies will be based on very sketchy scientific foundations, and that is dangerous.”
Professor McCartney goes on to criticize how data is handled and a gung-ho approach to trialing new technologies (reminiscent of the recent EDRi report into surveillance in Europe).
“Another issue is who is collecting the data that would be able to tell you whether or not these things worked, because the data to gauge accuracy, reliability or validity needs to be collected and analysed, and that is not happening.
“We have had trials, such as the South Wales Police trial on automatic facial recognition, and I think it is very problematic that these were trials. Essentially, they put the technology out into the wild and just keep an eye on it, and they call it a trial. That is not how scientific trials work.”
The call for further evidence is open until 5 September.
Meanwhile, a bill introduced in the House of Lords proposes to make it a criminal offence to “operate, install, or commission the operation or installation of, equipment incorporating automated facial recognition technology capable of biometrically analysing those present in any public place in the United Kingdom.”
The Automated Facial Recognition Technology (Moratorium and Review) Bill, introduced in 2020 by Lord Clement-Jones, is awaiting its second reading in the Lords. It also calls for a thorough review of the use of facial recognition technology in the UK.
Lord Holmes recently spoke to Biometric Update about other House of Lords explorations into digital identity.