Letter to Pelosi: Cut facial recognition spending from appropriations bills


Democrats in the U.S. House of Representatives appear to be digging in against biometric surveillance ahead of the November general elections and beyond.

A coalition of Democrats (whose names appeared on two previous surveillance-related letters) has written to House leaders asking that fiscal 2021 appropriations bills prohibit funding for facial recognition and similar activities.

This follows a June communication in which Democrats demanded answers from the executive branch about biometric surveillance programs reportedly being used on peaceful protesters by the FBI, U.S. Customs and Border Protection, the National Guard and the Drug Enforcement Administration. That note was followed by letters this month to two of those agencies with more pointed questions.

The House Oversight Committee has voiced concerns as well, in this case mostly focused on the use of facial recognition applications by law enforcement.

Also in June, House Democrat Eddie Bernice Johnson of Texas introduced a bill that would require new, multi-year research by the National Institute of Standards and Technology (NIST) into how facial recognition is being used and how bias can be eliminated from related software.

In the weeks prior to the street protests, the International Biometrics and Identity Association discussed a lobbying effort to win what it considers responsible facial recognition regulation.

The newest Congressional letter, sent July 17 to House Speaker Nancy Pelosi, a Democrat, and Minority Leader Kevin McCarthy, a Republican, recounts a brief list of mistakes with serious consequences involving police use of facial recognition.

The letter writers seem to overstate one point, however.

They state that NIST research found facial recognition systems mismatched Black and Asian faces “10 times to 100 times” more often than white faces, but that characterization misrepresents the findings.

NIST has an open-ended project that reviews face-scanning algorithms submitted by developers, and project researchers have found that quality varies widely, from the near-comical to software with “undetectable” levels of bias.

A Biometric Update interview with NIST’s biometric standards and testing lead, Patrick Grother, goes into more detail.

The letter also highlights a 2018 test by the American Civil Liberties Union of Amazon’s Rekognition software that appeared to confuse 28 House members with faces in an arrest database.

Amazon executives followed up, saying the default confidence setting used by the ACLU was inappropriate to the use case, and that the organization misinterpreted the results.

Reacting to growing public concern about facial recognition use by law enforcement, Amazon in June imposed a one-year moratorium on Rekognition use by law enforcement agencies.

The list of serious real-world mistakes is growing, as is discomfort among the biometrics vendor community.

An article in Vice found alarming problems with a scanning system used by the Detroit Police Department. Not all of them were technical glitches, either.

In 68 of the 70 cases in which DPD officers used their DataWorks Plus system during the first half of this year, the photograph scanned was of a Black person. That could indicate operator bias even before a scan starts.

Most of the pictures came from social media posts or security cameras.
