DHS biometric privacy test of face-obscuring AI is more of a pop quiz

Perhaps noting the visibility that the U.S. federal agency NIST has gotten with its public AI-bias evaluations, Homeland Security’s Science & Technology Directorate early last year announced its own tests of software for autonomously obscuring faces captured on video.
The results of the face-obscuring program, published this week, are significantly less useful than the National Institute of Standards and Technology’s tests of industry and research algorithms.
A video accompanying the release of the results spotlights motion blur and crowds as two of the most challenging video conditions for AI. Of course, it might be difficult to find anyone making or buying biometric systems who could not have tipped off Homeland Security to those findings in advance.
Unlike NIST, Homeland Security leaders did not name the five software applications that they said they chose from entrants “around the globe.”
Nor did the department say how many applications were submitted, noting only that industry response was “enthusiastic.” That vagueness makes gauging the state of the art impossible.
Presumably, some of the algorithms examined by Homeland Security are being considered by public and private organizations. Almost any additional detail about the results would be useful to anyone hunting for the most effective biometric security systems that also protect people’s privacy.
The agency notes that a range of camera setups was used in the tests and that, while the privacy tools generally worked well, some failed to obscure faces that were partially visible or viewed at an angle.
A quick list of useful datapoints would include company name and home nation, cloud or edge deployment, price, and proprietary or open architecture. Scoring performance numerically would seem a natural metric for would-be buyers, too.
Instead, Homeland Security summarized its findings by focusing on a number of conditions that can confuse obscuring algorithms. Each anonymous vendor was judged either satisfactory at handling a condition (marked in teal) or in need of improvement (orange).
Binary, vague judgments are kinder to underperforming vendors than they are useful to the industry at large, buyers or citizens.
It does not help that the department chose to display the results online in a way that tries to be clever at the expense of usability. Scrolling causes page elements to appear and disappear confusingly.
NIST’s biometric reports can sometimes have the graphical flair of a meat tenderizer, but the data stays put on a page. A prime example is the agency’s iconic Face Recognition Vendor Test, which is transparent and detailed.