Those wronged by facial recognition join rights groups in calling for police ban

Racial bias and lack of accountability cited as major problems
There is a small but growing lobby made up of people who have been wrongly arrested as a result of facial recognition technology. Among them is Robert Williams, an American who was handcuffed in front of his family in 2020 after police facial recognition misidentified him as a suspect in a federal larceny case.

Williams is now calling for police forces in Ireland to scrap their plans to deploy the biometric tech. In comments made at an event in Dublin hosted by the Irish Council for Civil Liberties (ICCL) and issued in a release, Williams points to the risk that comes with using tools that are prone to misidentify people of color.

“Federal studies have shown that facial recognition systems misidentify Asian and Black people up to 100 times more often than white people,” Williams says. “In America, we’re trying to undo the harms that FRT has already done. Here in Ireland, you have an opportunity not to introduce it in the first place. I hope your government will listen to experiences like mine and think twice before bringing FRT into policing.”

Williams refers to a 2019 report from NIST, which has since been updated, showing that some algorithms were 10 to 100 times more likely to misidentify a Black or East Asian face than a white one. Not all of the algorithms evaluated are in commercial production, however, and others were found to have imperceptible differences in performance between demographics, prompting NIST Biometric Standards and Testing Lead Patrick Grother to urge those implementing facial recognition to evaluate bias in the specific algorithms they deploy.

Williams’ statement on the U.S. could also be debated, given the uptake of facial recognition technology by law enforcement agencies across the country. And while it is true that Irish police could still decide to pass on facial recognition, it is unlikely. The government is in the process of drafting legislation that would give Gardaí access to FRT. And police in the neighboring UK have embraced facial recognition with aplomb.

Nor is it merely an island thing. Police in Sweden are currently pushing against the limits of the still-fresh AI Act with plans to deploy 1:N facial recognition in public spaces. And Canadian police recently contracted Idemia to provide facial recognition services.

Almost everyone misidentified by facial recognition in the US is Black

Pushback continues, however, on both sides of the Atlantic. A release says the Center for American Progress (CAP), the Lawyers’ Committee for Civil Rights Under Law and 14 other civil rights and advocacy groups have sent a letter to the National Institute of Justice (NIJ) regarding the use of AI in the criminal justice system, in response to the NIJ’s request for information ahead of a forthcoming report on the matter.

The letter does not pull punches.

“Black people and other people of color face significant civil rights harms resulting from law enforcement’s use of AI,” it says. “Algorithmic technologies used by law enforcement agencies replicate and reinforce existing racial bias and discrimination; policing technologies such as facial recognition, risk assessments, and predictive policing tools often enable and accelerate this trend.”

The coalition says biases are “baked into the development and deployment” of AI-driven biometric tools. It points an accusatory finger at databases used to train AI engines – “vast troves of data that are rife with inaccuracies and reflect existing societal biases and inequities.” Furthermore, statistics show that the human police who use facial recognition are also prone to bias.

“For instance,” the letter says, “a recent report found that the New Orleans Police Department disproportionately used facial recognition technology (“FRT”) to identify Black individuals; they deployed the technology on Black people over 90 percent of the time.”

All but one of the seven people in the U.S. who have been misidentified by facial recognition and wrongfully arrested are Black.

Racial bias is not the only issue. The coalition also says transparency is lacking. “Shrouded in secrecy, law enforcement agencies use AI technologies – ranging from police surveillance to suspect identification and prison-management tools – without meaningful public input or oversight.” Without transparency, there is no accountability.

The letter concludes with four recommendations on what the NIJ should address in its forthcoming report. They read more like demands.

First, “law enforcement’s use of racially discriminatory technologies, including FRT and predictive policing tools, should be prohibited.”

Second, “the use of algorithmic surveillance tools should be prohibited in public places and in any setting that could chill the exercise of First Amendment rights.”

Third, “Independent pre- and post-deployment audits of AI technologies used for law enforcement purposes should be required.”

Finally, “law enforcement use of AI technologies should be disclosed to defendants in criminal cases.”
