The key to coherent biometrics laws? It might be found in people who’ve been wronged

The political left and right in the United States each now have their martyrs to indifferent and inaccurate facial recognition algorithms used by law enforcement.

So armed, lawmakers may now have the beginnings of a successful federal effort to regulate the software nationally.

A House subcommittee Tuesday listened to testimony from experts on the topic, including a Michigan father of two who was wrongfully thrown in a detention cell for 30 hours based on a false match.

Members of the Crime, Terrorism, and Homeland Security subcommittee heard from eight witnesses, all of whom said facial recognition systems can help solve crimes but must first be made more accurate, fully transparent, less biased and stringently regulated.

The witnesses ranged from Brett Tolman, executive director of Right on Crime, a conservative law enforcement lobby group, and Kara Frederick, a research fellow with the conservative think tank The Heritage Foundation, to Cedric Alexander, a former member of President Barack Obama’s 21st Century Policing task force, and Bertram Lee Jr., a policy counsel for The Leadership Conference on Civil and Human Rights.

In the mix was Robert Williams, a Farmington Hills, Mich., resident who told of being arrested on his front lawn in January 2020 by Detroit officers on a felony larceny charge.

Williams, a soft-spoken witness, said no one would tell him why he had been scooped up. Only many hours after his arrest did officers reveal that he had been implicated in an alleged retail theft by "the computer" comparing photographs. It was hours more before he was released, he said, without an apology or a satisfactory explanation.

Subcommittee members of both parties nodded to his "compelling" testimony, but Republicans also brought up the case of a Homer, Alaska, couple whose house was searched for House Speaker Nancy Pelosi's laptop, stolen during the Capitol insurrection last January.

Marilyn and Paul Hueper were led away handcuffed for questioning by the FBI. At least one House member claimed they had been held at gunpoint during their interrogation, but no evidence of that has surfaced.

The pair was shown a photo of a woman inside the Capitol that, according to the Associated Press, left Marilyn wondering if someone had altered the original image.

She maintains that she was in Washington, D.C., for the Trump rally to overturn the vote, wearing the same coat as the photographed woman and sporting the same hairstyle, but that she never walked inside the building.

Both examples highlight how few rules and standards exist nationally or locally for using facial recognition in police work. Multiple times, witnesses said there are not even reliable figures on which agencies and departments use the algorithms.

The federal watchdog Government Accountability Office recently issued a report, the first of its kind, inventorying facial recognition systems used by federal agencies. Among other things, investigators found that many agencies use private systems or subscription services, each of which may handle privacy differently.

A non-governmental watchdog group, the Project on Government Oversight, submitted written remarks warning that facial recognition is not "monolithic" as a technology. Some algorithms are more biased and less accurate than others, and even the best yield poor results from poor original images, results that should not be trusted.

For that reason alone, according to the project organizers, police should not use anyone’s face biometrics to generate leads for crimes, although there are numerous examples of officers doing just that.

Finally, the organization said, it would be insufficient for lawmakers to think that the dangers posed by this industry can be solved by addressing the errors of individual companies, coders or their products.

A clear and evolving set of standards and laws can and should be written to allow the technology to grow transparently, with accuracy and without bias, the group wrote.
