US needs facial recognition legislation, NIST guidance to protect civil rights: report
Facial recognition’s benefits for law enforcement and civil applications run by America’s federal government could be outweighed by its negative impact on civil rights if the right safeguards are not introduced, according to the U.S. Commission on Civil Rights.
The new 194-page report on “The Civil Rights Implications of the Federal Use of Facial Recognition” acknowledges the usefulness of facial recognition for federal agencies, but also raises a range of concerns, and identifies several areas where improvement is needed to safeguard against civil rights violations.
The Commissioners heard from many stakeholders and experts, heavily citing Patrick Grother of the National Institute of Standards and Technology in the report, and visited DHS’ Maryland Test Facility to learn about biometrics testing.
The report considers how facial recognition is used by the Department of Justice (DoJ), the Department of Homeland Security (DHS) and the Department of Housing and Urban Development (HUD).
The Federal Bureau of Investigation and U.S. Marshals Service are the primary users of facial recognition among DoJ agencies, while DHS uses the technology in its operations to increase safety and deliver services more efficiently. HUD uses facial recognition integrated with security cameras in federally funded public housing, with the technology funded through Emergency Safety and Security grants.
IntelCenter provides a facial recognition service dedicated to terrorism investigations, and Marinus Analytics provides one dedicated to identifying victims of human trafficking. Thorn's tool is used to find sexually exploited children and identify their traffickers and abusers.
On the civil use side, Idemia’s facial recognition is used by the TSA PreCheck program.
Clearview AI provides a more general law enforcement service that has been used by five agencies, though the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF), the Drug Enforcement Administration (DEA) and the Secret Service had all discontinued its use as of April 2023.
The FBI, USMS and the Child Exploitation and Obscenity Section of the Criminal Division are the only DoJ agencies now using facial recognition, the Department told the Civil Rights Commission. CBP has implemented face biometrics at all international airports for entry and 53 for exits, plus 40 seaports, as well as all pedestrian crossing points on the northern and southwest borders.
DoJ and DHS have put interim policies in place to govern their use of facial recognition, but HUD does not track its use.
Concerns raised
Civil rights concerns raised in the report include the variations in accuracy found between different developers, as well as between different races and genders, which open the door to discriminatory practices that violate civil rights. The technology is also easy to use by people who lack an adequate understanding of it, the Commissioners found. They also spotlight a lack of standards, from development of the core technology to federal government policy for using it.
The lack of policy standards, training and oversight represents a troublesome gap that the Commission suggests could be addressed by Congress.
“Unregulated use of facial recognition technology poses significant risks to civil rights, especially for marginalized groups who have historically borne the brunt of discriminatory practices,” said Rochelle Garza, Chair of the U.S. Commission on Civil Rights. “As we work to develop AI policies, we must ensure that facial recognition technology is rigorously tested for fairness, and that any detected disparities across demographic groups are promptly addressed, or its use suspended until the disparity has been addressed.”
Commissioner Mondaire Jones notes concerns about accuracy, oversight, transparency, discrimination, and access to justice in the use of facial recognition.
Recommended actions
The Commission recommends that NIST stand up an operational testing protocol for agencies to use when setting up their deployments. Congress should tie funding to the adoption of training standards for people reviewing and analyzing facial recognition results, and establish a statutory redress mechanism for those harmed by the technology’s misuse.
Chief AI Officers should encourage national training standards, mitigate demographic disparities and consult with affected communities to inform agency decisions. They should use the Maryland Test Facility (MdTF) as a template for real-world testing, the report says.
Agencies using FRT should have a policy, and make it publicly available. Vendors should provide ongoing training, support and updates to ensure high accuracy for different demographic groups in real-world deployments.
Bodies receiving federal grants and using facial recognition should refer to the accuracy and demographic disparities found in NIST’s Face Recognition Technology Evaluation (FRTE) or equivalent testing.
The final word in the report goes to Commissioner Glenn Magpantay, who states, “It is time for Congress to act on artificial intelligence and facial recognition technology.”