
DHS biometric privacy test of face-obscuring AI is more of a pop quiz

Perhaps noting the visibility that the U.S. federal agency NIST has gotten with its public AI-bias evaluations, Homeland Security’s Science & Technology Directorate last year announced its own tests for autonomously obscuring faces captured on video.

The results of the facial recognition program, published this week, are significantly less useful than the National Institute of Standards and Technology’s tests of industry and research algorithms.

A video showcasing the release of results spotlights motion blur and crowds as two of the most challenging video conditions for AI. Of course, it might be difficult to find someone making or buying biometric systems who could not have tipped off Homeland Security to those findings in advance.

The demonstration was announced early last year.

Unlike NIST, Homeland Security leaders did not name the five software applications that they said they chose from entrants “around the globe.”

Nor does the department say how many applications were submitted, noting only that industry response was “enthusiastic.” That lack of detail makes gauging the state of the art impossible.

Presumably, some of the algorithms examined by Homeland Security are being considered by public and private organizations. Almost any other information describing results likely would be a useful factor to consider when hunting for the most effective biometric security systems that also protect people’s privacy.

The agency notes that a range of camera implementations were used in the tests, and that while the privacy tools generally worked well, some failed to obscure faces that were partially visible or at an angle.

A quick list of useful data points would include company name and home nation, cloud or edge deployment, price, and proprietary or open architecture. Applying a score to performance would seem a natural metric for would-be buyers, too.

Instead, Homeland Security summarized its findings by focusing on a number of conditions that can confuse obscuring algorithms. And each anonymous vendor was judged as either satisfactory at dealing with a condition (noted in the color “teal”) or as needing improvement (“orange”).

Binary, vague judgments are kinder to underperforming vendors than they are useful to the industry at large, buyers or citizens.

It does not help that the department chose to display the results online in a way that tries to be clever at the expense of usability. Scrolling causes features to confusingly appear and disappear.

NIST’s biometric reports can sometimes have the graphical flair of a meat tenderizer, but the data stays put on a page. A prime example is the agency’s iconic Face Recognition Vendor Test, which is transparent and detailed.



