Everyone in facial recognition wants to impress US civil rights commission
More than a dozen people met this week with the U.S. Civil Rights Commission to talk about how facial recognition can affect the daily lives of Americans. Many voices cautioned commissioners against a hands-off stance.
Some speakers shared insights they felt would help the commission wrestle with the technology and its implications. Others, from the bureaucracy, showed up to the hearing to explain what their agency is doing regarding biometrics.
And one conveyed a surprising message: All’s well. False alarm.
The following is a bulletin-style summary of comments from the speakers most notable to the biometrics community. Most of their remarks are well worth reading in full.
First, the people trying to raise the commissioners’ awareness.
Michael Akinwumi, chief responsibility officer of the National Fair Housing Alliance: Equity in housing has large and direct impacts on the national economy, which should give policymakers pause before affordable housing is turned into invasively and racially surveilled compounds that drive away deserving consumers.
Michelle Ewert, director of the Washburn Law Clinic in Washington, D.C.: “Because of the potential for errors and the invasion of tenants’ privacy rights, it is imperative that the federal government take steps to protect subsidized tenants from harmful FRT.” She wants the commission to join her in calling for contracts that prohibit housing providers from using facial recognition on their properties.
Clare Garvie, training and resource counsel for the National Association of Criminal Defense Lawyers and “The Perpetual Line-up” co-author: “Facial recognition technology risks entrenching historical racial biases in the criminal legal system based on several interrelated factors: the disproportionate use of the technology in communities of color, an extension of historical over-policing; the disproportionate enrollment of people of color — particularly Black men — in facial recognition databases through reliance on mugshot databases that reflect both the legacy and ongoing reality of over-policing; and the demographically-based differential error rates that the technology continues to exhibit. Second, current police use of facial recognition lacks scientific validity.”
Nicol Turner Lee, senior fellow in governance studies and director of the Center for Technology Innovation at the Brookings Institution: The commission should not wait for any other sector of government to set guardrails. Lead the movement. The reliability of face recognition applied as a forensic tool is not a foregone conclusion. Research into how facial recognition affects all relevant areas of law enforcement “is absolutely imperative.”
Here’s where the narrative begins to change to “look what we do.”
Arun Vemury, senior engineering advisor for biometric technologies in the Homeland Security Department’s Science & Technology Directorate: “By some measures, face recognition may be the most carefully tested AI technology, and a significant fraction of what we know about commercial implementations today has been gained by testing supported by DHS S&T.”
That said, “We plan to ensure deployed system performance with the same rigor and discipline that we have previously used to assess commercial face recognition technologies and systems including measuring efficiency, accuracy, and equitability.”
Miami, Fla., Assistant Chief of Police Armando R. Aguilar: His department uses AI policing tools extensively. “We use gunshot detection systems, public safety cameras, facial recognition, video analytics, license plate readers, social media threat monitoring and mobile data forensics.” The tools complement officers, he said.
“The perception among the community is that the police are, at best, unable to keep them safe; or at worst, unwilling to. Artificial intelligence bridges that gap by allowing law enforcement to solve and prevent crime and to protect our most vulnerable communities.”
Diane Sabatino, an acting executive assistant commissioner in U.S. Customs and Border Protection: “CBP’s use of biometric facial comparison technology standardizes, automates and enhances manual processes, which strengthens security, efficiency and accuracy and allows CBP officers to focus on threat detection and situational awareness.”
The agency doesn’t track race, but it does monitor country of citizenship, gender and age. In that regard, CBP claims “an average technical match rate of 99.4 percent on entry and 98.1 percent on exit.”
Patrick Grother, biometric standards and testing lead at the National Institute of Standards and Technology: NIST’s role is well known in the biometrics community. Its testing is probably the biggest single factor in the government’s acceptance of facial recognition.
“In 2024, NIST will resume our Face in Video Evaluation Program (FIVE) to assess the capability of facial recognition algorithms to correctly identify or ignore persons appearing in video sequences.”
Hoan Ton-That, CEO of facial recognition developer Clearview AI: “In the NIST 1:N Face Recognition Vendor Test, Clearview AI’s algorithm found the correct face out of a lineup of 12 million photos at an accuracy rate of 99.85 percent.”
“Every positive identification made with Clearview AI’s technology is also an instance in which a misidentification from reliance on eyewitnesses was prevented. This is the other positive side to facial recognition that is not talked about.”