
Cops made poor training wheels for AI in UK face biometrics trials


Autonomous face biometrics does not have to be perfect yet, say proponents. Especially with systems used by police, humans can be active partners until algorithms are perfected.

That is an attractive argument for vendors and for nervous politicians trying to assuage growing distrust of facial recognition among the public and privacy advocates. But opponents argue that it is not true now, and may never be.

A new study, supported by grants from the UK government and South Wales Police and published in The British Journal of Criminology, makes the case, based on two large UK system trials, that fallible humans today significantly influence biometric algorithms, and not always for the better.

The biases begin with how the AIs are created, of course, but humans also play integral roles in the operation of so-called assisted systems, shaping everything from where cameras are set up in public spaces to who gets picked up for questioning.

In fact, given the myriad variables involved in police use of face biometrics and, indeed, in competent and fair policing (which successful software would obviate), an autonomous system that approaches perfection may still be on the most distant horizon.

In this study, algorithms were trained to spot people in crowds and issue an alert if they could match a captured face with their data sets of suspects.

The study’s authors described numerous points where humans in trials had to put their finger on the scales.

Looking at success rates for many event deployments, they wrote, “results follow no particular pattern.” The variability is largely the result of human involvement, according to the study.

The numbers for one trial, run from 2018 to 2019, were not encouraging. The share of system matches that an officer deemed credible ranged from 33 percent to 100 percent across deployments. But the share of total matches that proved correct after a subject’s ID was checked was zero on two dates and peaked at 44 percent.

“[O]ne striking finding is the tendency of officers to agree with (cede discretion to) the algorithm and the high chance that computer-generated matches would not be verifiably correct,” the authors wrote.
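As a rough illustration of the gap the authors highlight, the sketch below computes both rates — the share of alerts an officer judged credible and the share that survived an ID check — for two hypothetical deployments. The tallies are invented for illustration; they are not figures from the study.

```python
# Invented per-deployment tallies, illustrating the two rates the study
# reports: officer-credible alerts vs. alerts verified correct by ID check.
deployments = [
    {"alerts": 10, "officer_credible": 6, "verified_correct": 0},
    {"alerts": 9,  "officer_credible": 9, "verified_correct": 4},
]

for d in deployments:
    credible_rate = d["officer_credible"] / d["alerts"]
    verified_rate = d["verified_correct"] / d["alerts"]
    print(f"credible: {credible_rate:.0%}, verified correct: {verified_rate:.0%}")
```

Note how the second deployment shows officers agreeing with every alert (100 percent credible) while fewer than half proved correct, the pattern the quote above describes as ceding discretion to the algorithm.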

On the other hand, there were observed instances of officers discounting subsequent alerts after encountering a false positive. In those cases, the algorithm measurably shifted officers’ professional suspicion away from the faces in the crowd.

In all cases, watch lists of previously convicted criminals, suspects and persons of interest were hand-picked by officers. It was up to officers to decide what degree of criminality qualified a person for inclusion on a watch list.

The term “person of interest” is itself subjective, and its application would differ from officer to officer.

It is possible that officers already knew who might pop up at a given event, reducing randomness but also biasing algorithms toward those with criminal records, people of color and younger people.

The police could change the threshold at which alerts were issued, too. Matches were scored on a range from zero to one, and those scores, according to the report, directly influenced human decisions.

The higher the score, the more likely an officer was to assume the algorithm was correct in issuing the alert. That is black-and-white reasoning applied to a scale designed to be a graduated advisory tool.
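The report does not publish the trial systems’ actual score bands, so the sketch below is a hypothetical illustration of the distinction at issue: a 0-to-1 similarity score with an adjustable alert threshold, treated as a gradient of confidence rather than a binary verdict. The threshold and band values are assumptions, not figures from the trials.

```python
def classify_match(score: float, alert_threshold: float = 0.6) -> str:
    """Map a 0-1 similarity score to an advisory band, not a yes/no answer.

    The alert threshold is operator-adjustable, as it was in the trials;
    the specific cut-offs here are illustrative only.
    """
    if score < alert_threshold:
        return "no alert"
    elif score < 0.8:
        return "alert: low confidence - verify in person"
    else:
        return "alert: high confidence - still requires human verification"

# Lowering the threshold produces more alerts (and more false positives);
# raising it suppresses alerts but risks missing genuine matches.
print(classify_match(0.55))  # no alert
print(classify_match(0.65))  # alert: low confidence - verify in person
print(classify_match(0.90))  # alert: high confidence - still requires human verification
```

Treating the score this way preserves the gradient the scale is designed to convey, instead of collapsing every alert into an implicit “the computer says it’s them.”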

Siting cameras should be straightforward enough, but not only is it impossible for today’s algorithms to position the lenses themselves, the systems also demand minimum conditions, including adequate lighting and unobstructed views of faces.

One targeted event stretched past dusk, according to the study, when pickpockets were most active. Because the system needed more light, officers decided to monitor during the day instead.

In another instance, cameras were placed in a mall a half-mile from a crime hot spot because it was easier to do so.

Then there was an event at which cameras were placed where police could quickly nab suspects, a spot distant from where most of the crimes were occurring.

And in an example of how unpredictably fickle human influence can be when deploying face biometrics, the authors recounted an instance in which a camera was placed to best spot enrolled faces, those on the watch list, during an event. An unsuspecting flag vendor set up a stand too near the camera, and a fluttering Italian flag occasionally interfered with the system’s line of sight.

Officers considered asking the entrepreneur to move, but ultimately decided not to because people looked up at the flag, affording the system and operators a better (intermittent) view of their faces.

The degree of decision-making and effort required by police to achieve reasonably effective live facial recognition deployments is such that the technology’s role is better understood as “assisted,” rather than “automated facial recognition,” the report concludes.
