DHS releases ‘comprehensive’ report on use of face biometric systems

The U.S. Department of Homeland Security’s (DHS) new report on the department’s use of facial recognition (FR) and face capture (FC) technologies underscores the importance of maintaining a balance between security and civil liberties in the age of artificial intelligence.
The report highlights that as DHS continues to deploy FR/FC technologies, it faces the dual challenge of maximizing their potential while minimizing their risks. Future efforts will likely focus on improving algorithmic fairness, strengthening data protection measures, and enhancing public awareness about the benefits and limitations of these systems.
The report was released on the heels of renewed scrutiny of the use of FR/FC across the DHS enterprise. Earlier this month, House Committee on Homeland Security Chairman Mark E. Green and Rep. Carlos Gimenez, chairman of the Subcommittee on Transportation and Maritime Security, formally “requested a detailed review from the Government Accountability Office on the Transportation Security Administration’s (TSA) implementation of biometric identification and use of AI-driven technology in its homeland security mission.”
Other lawmakers have recently taken similar actions over concerns about privacy, civil liberties, and the efficacy of these biometric technologies, reflecting a growing bipartisan effort to scrutinize and regulate federal agencies’ use of advanced technologies.
Nevertheless, DHS says that by committing to ethical innovation and transparent governance, it aims to demonstrate that privacy and security can coexist in the digital era. However, the broader societal debate on the role of surveillance technologies in public life remains ongoing, and it is imperative that policymakers, technologists, and citizens engage in this discourse to shape a future that respects both individual rights and collective security.
The report, required by DHS Directive 026-11, analyzes FR/FC use cases and provides a comprehensive overview of the benefits, challenges, and regulatory safeguards associated with FR/FC systems. The directive established an enterprise policy for the authorized use of FR and FC technologies across DHS; the department says the report “presents more information than ever previously shared about how we use and govern these technologies.”
The report encompasses completed performance reviews of eight priority uses of FR/FC, based on direct testing, analysis of operational reporting statistics, and reviews of third-party testing results, and it analyzes demographic differentials where possible.
“Face recognition technology can be controversial, and when used improperly, it can cause real harm. That’s why in 2023, DHS implemented the most stringent requirements of any federal agency for how FR can be used and how it must be tested,” said Eric Hysen, former DHS CIO and Chief AI Officer.
Hysen said the report shows “our FR systems performed extremely well for diverse demographic groups. For fully operational systems, like ID checks for travelers at airports and ports of entry, the technology worked more than 99 percent of the time. And when minor issues were identified, we acted swiftly to address them.”
“We’re not perfect,” Hysen added, “but this report highlights how FR is delivering real value for the public and supporting critical law enforcement missions while maintaining our commitment to transparency, accountability, and responsible AI use.”
“Overall, FR/FC systems performed extremely well for diverse demographic groups,” the report says, noting that “on average, the technology worked more than 99 percent of the time for systems that are fully operational, like ID checks for travelers at the airport and ports of entry to the United States.”
Facial recognition and face capture technologies have become instrumental in enhancing efficiency and security across DHS operations. Used for identity verification at airports, border crossings, and during criminal investigations, these technologies promise improved public interactions and more robust law enforcement capabilities. FR/FC technologies automate processes traditionally reliant on manual verification, such as matching traveler identities to passports, thus reducing human error and increasing operational speed.
For example, the TSA’s deployment of the Credential Authentication Technology with Camera System (CAT-2) allows for rapid passenger verification at security checkpoints. This system employs a one-to-one facial matching protocol, comparing a traveler’s live photo to the one on their ID. On average, these processes take only seconds, ensuring operational efficiency while reducing the risk of fraud or impersonation.
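In engineering terms, a one-to-one check of this kind typically reduces both the live capture and the ID photo to embedding vectors and compares them against a calibrated similarity threshold. The sketch below is illustrative only; the similarity metric, threshold value, and function names are assumptions, not details drawn from the DHS report.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_identity(live_embedding: np.ndarray,
                    id_embedding: np.ndarray,
                    threshold: float = 0.6) -> bool:
    """One-to-one verification: does the live capture match the ID photo?

    The 0.6 threshold is a placeholder; operational systems calibrate it
    against target false-match and false-non-match rates.
    """
    return cosine_similarity(live_embedding, id_embedding) >= threshold
```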
However, the deployment of FR/FC technologies inherently raises privacy concerns, especially regarding biometric data collection, storage, and usage. DHS has implemented policies to address these concerns, including the deletion of U.S. citizens’ facial images within 12 hours of capture. Non-citizens’ images are stored for longer periods, often up to 14 days, but are governed by strict retention and access controls. Data from FR/FC processes, particularly those involving sensitive populations such as minors, are handled with heightened care, DHS says.
DHS also emphasizes transparency in its practices, pointing out that individuals can opt out of non-enforcement uses of facial recognition, such as during airport check-ins, without penalty. Signage and public notices aim to ensure that individuals understand how their data is being collected and for what purpose. DHS also engages in public consultation to continually refine its policies and practices.
A recurring challenge with facial recognition technologies is the potential for algorithmic bias, the report acknowledges, pointing to studies that have shown that some systems perform less accurately for certain demographic groups, such as people with darker skin tones. DHS says it has taken steps to mitigate this problem, including rigorous testing. These evaluations assess demographic differentials in FR/FC systems, helping to identify and address disparities in accuracy, DHS said.
For instance, the TSA’s CAT-2 system demonstrated uniform performance across various demographic groups, achieving over 99 percent success rates in both face capture and matching processes. Other systems, like the Global Entry Touchless Portals, revealed minor variations in accuracy based on skin tone and age, prompting DHS to continue monitoring and refining the algorithms to ensure equitable performance.
The report states that testing revealed that for TSA PreCheck’s prototype Touchless Identity Solution, “the face matching worked well,” but that there were “issues with the face detection algorithm used to verify if a photo contains a face before matching. This algorithm was accurate 88 percent to 97 percent of the time, with performance varying based on skin tone and self-reported race, gender, and age. To address this, TSA quickly introduced a manual photo capture step, which only adds 2-3 seconds to the process and does not affect the overall screening experience.”
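The fix TSA describes amounts to adding a fallback branch ahead of matching: if automated face detection fails, a manually triggered capture is attempted before the traveler is routed to a standard document check. A minimal sketch of that flow, with all helper names hypothetical, might look like this:

```python
from typing import Callable, Optional
import numpy as np

# All names below are illustrative; the report does not describe DHS's actual code.

def capture_and_verify(
    auto_frame: np.ndarray,
    id_embedding: np.ndarray,
    detect_face: Callable[[np.ndarray], Optional[np.ndarray]],
    manual_capture: Callable[[], np.ndarray],
    embed_face: Callable[[np.ndarray], np.ndarray],
    match: Callable[[np.ndarray, np.ndarray], bool],
) -> bool:
    """Detection-then-match flow with a manual photo-capture fallback.

    If the automated detector misses a face in the initial frame, a manually
    triggered capture is tried before falling back to a manual ID check.
    """
    face = detect_face(auto_frame)
    if face is None:
        face = detect_face(manual_capture())  # adds a few seconds to the process
    if face is None:
        return False  # route the traveler to a manual document check
    return match(embed_face(face), id_embedding)
```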
DHS said TSA and the department’s Science and Technology Directorate are evaluating new algorithms to improve this step and plan to test and implement them later this year.
Additionally, the report says that “two other minor trends in test results” were noted “that will be monitored going forward.”
For some Customs and Border Protection use cases, the report says, “there were very small differences in measured face matching performance based on skin tone and self-reported race and age, ranging from less than 1 percent to 2-3 percent. Face matching still performed well overall, and the lowest success rate for any demographic group was 97 percent. This round of testing was only designed to reliably detect differences of 5 percent or greater, so we can’t say if smaller measured differences reflect true underlying differences in performance. We will continue to monitor these trends, refine our testing practices, and take action as appropriate.”
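The 5 percent caveat is a statement about statistical power: with the sample sizes used, smaller gaps cannot be reliably distinguished from random variation. A rough illustration of that reasoning, using a standard two-proportion z-test and made-up sample sizes (the report does not publish them), is sketched below.

```python
from math import sqrt
from scipy.stats import norm

def power_two_proportions(p1: float, p2: float, n1: int, n2: int,
                          alpha: float = 0.05) -> float:
    """Approximate power of a two-sided z-test for a difference in
    success rates between two demographic groups."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se_null = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    se_alt = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z_crit = norm.ppf(1 - alpha / 2)
    return float(norm.cdf((abs(p1 - p2) - z_crit * se_null) / se_alt))

# Hypothetical sample sizes of 400 per group: a 5-point gap is detectable,
# but a 2-point gap is close to a coin flip.
print(power_two_proportions(0.99, 0.94, 400, 400))  # ~0.97 power
print(power_two_proportions(0.99, 0.97, 400, 400))  # ~0.52 power
```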
The use of facial recognition extends beyond travel and border security. Homeland Security Investigations (HSI) employs this technology to combat child sexual exploitation and abuse. DHS said HSI investigators can identify victims and perpetrators more efficiently by leveraging tools like Clearview AI. However, DHS has established stringent protocols for this sensitive use of FR/FC to ensure that the technology is used only after exhausting other investigative techniques.
DHS says HSI’s use of FR/FC technologies is carefully circumscribed to protect civil liberties. Leads generated through facial recognition are subject to rigorous human review and corroboration with additional evidence before any enforcement action is taken. These safeguards aim to balance the imperative of rescuing vulnerable populations with the need to uphold individuals’ rights to privacy and due process.
Public trust is critical to the successful implementation of FR/FC technologies. Recognizing this, DHS has sought to provide avenues for redress and feedback. Travelers who experience issues with FR/FC systems can contact the DHS Traveler Redress Inquiry Program or file complaints with the Office for Civil Rights and Civil Liberties.
Additionally, DHS says it regularly collaborates with privacy advocates, civil rights organizations, and technology experts to review its policies. This collaborative approach not only enhances the credibility of the agency’s initiatives but also ensures that its practices evolve in response to technological advancements and societal expectations.