U.S. watchdog wants law enforcement to follow basic biometrics rules to build trust
A U.S. government watchdog agency is pushing three policies, none of which should be a surprise, to increase trust in police use of forensic biometric algorithms.
As with almost all biometric systems, the promise of greater productivity and accuracy from forensic algorithms is matched only by concern about their reliability, trustworthiness and freedom from bias.
Along with its recommendations, the Government Accountability Office has identified the most prevalent biometric algorithms used today in forensic investigation tools: latent fingerprint matching, facial recognition and probabilistic genotyping.
This is not new ground. The GAO issued accountability-building rules for AI systems. The Security Industry Association is pushing suggestions to ward off moratoriums. And the European Union last year discussed a regulatory framework to build trust.
The GAO has developed — although agency leaders surely mean to say borrowed — its options in the hope that fewer operational black eyes will win public support for the software’s use.
Agencies and private businesses first must build or borrow standards and policies governing biometric algorithm use. Carelessness and misuse need to be avoided and, when necessary, rapidly righted.
The potential scale of unwanted outcomes from forensic biometrics, including negating a person’s human rights, is more bracing than lesser, though still unfortunate, human missteps made in the course of doing a difficult job.
And some people who today sit on the sidelines when law enforcement takes controversial actions could join the opposition to policing if AI is involved.
The GAO also recommends increased training for everyone involved in creating and operating forensic algorithms. Avoiding bias, along with mistakes and misuse, is critical to automating a foundational pillar of democratic society.
The third recommendation is to maximize transparency at every stage of a forensic algorithm’s life cycle. The choices of data sets, tests, metrics, goals and corrective actions have to be as public as possible without, obviously, opening up the systems to tampering or gaming.
Being honest about the potential mistakes and open about misuse, and the procedures to minimize these inevitabilities, can build trust.
All three recommendations would aid in increasing trust in police work at a time when it really needs it.
Fingerprint algorithms are the most familiar to government, industry and the public, and of the three prevalent biometric algorithm types, theirs is the oldest. And yet, something as simple as a print’s orientation relative to a stored image can tank some vendors’ matching algorithms, according to the GAO’s report.
Facial recognition has its own collection of advantages and disadvantages, and would benefit most from demystification in the short term.
Probabilistic genotyping is the most esoteric of the three, and is a trickier tool than even face biometrics. The GAO says probabilistic genotyping can increase the variety of DNA evidence that can be analyzed in fighting crime.
Being probabilistic (as are most automated biometric tools), it cannot always do what many people think automation does: offer a binary answer to questions. Instead, algorithms typically assign numerical scores to multiple possibilities.
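The distinction matters enough to spell out. A minimal sketch, using made-up candidate names and cosine similarity as a stand-in for any vendor’s proprietary scorer, shows what a probabilistic matcher actually returns: not a yes/no verdict, but a ranked list of scores that a human analyst or policy threshold still has to interpret.

```python
# Hypothetical illustration only: toy feature vectors stand in for the
# features a real system would extract from fingerprints, faces or DNA.

def score_candidates(probe, gallery):
    """Return each candidate's similarity to the probe, highest first."""
    def similarity(a, b):
        # Cosine similarity as a placeholder for a proprietary scorer.
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    return sorted(
        ((name, similarity(probe, feats)) for name, feats in gallery.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

gallery = {
    "candidate_a": [0.9, 0.1, 0.3],
    "candidate_b": [0.2, 0.8, 0.5],
    "candidate_c": [0.4, 0.4, 0.4],
}

# No single "match" comes back -- every candidate gets a score.
results = score_candidates([0.85, 0.15, 0.35], gallery)
for name, score in results:
    print(f"{name}: {score:.3f}")
```

Every candidate scores above zero here, which is precisely the point: whether the top score counts as a "match" is a policy decision, not an output of the algorithm.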
That makes facial recognition potentially problematic too, but in the case of faces, people can at least see for themselves whether results seem sound. That is not the case with DNA analysis.