Portland face biometrics ban proposal reflects basic misunderstanding of the technology, IBIA says
Draft bills under consideration by the City of Portland to prohibit biometric facial recognition from being used by public and private entities are based on a definition of the technology which is unsupported by science or experts, according to the International Biometrics + Identity Association (IBIA).
The rationale behind the ordinances is unsupportable, the IBIA writes in its comments on the draft bills. The group bases that conclusion on NIST testing showing no detectable differences in accuracy between demographic groups among top-performing algorithms, on the benefits of facial recognition, on the risks an open-ended moratorium on the technology’s use poses to public safety and national security, and on the misunderstanding of facial recognition reflected in the bills’ definition of it.
Portland City Council has sought public comment on the draft ordinances, which were delayed from November 2019 to this year, and then again to June 15. The legislation has yet to be enacted, and has faced pushback from groups including Amazon even as it has expanded from its original focus on local government agencies.
“Face Recognition Technology means an automated or semi-automated process that assists in identifying, verifying, detecting or characterizing facial features of an individual or capturing information about an individual based on an individual’s face,” one of the ordinances reads.
IBIA provides more than a dozen points on how to correctly interpret the NIST test, saying Portland City Council appears to have ignored key takeaways from the report while declining to reveal sources for its claims of routine bias in the technology. The group points out NIST’s finding that 30 different algorithms return fewer than three false non-matches per thousand queries, and cites the example of fingerprint recognition accuracy for Asian women improving dramatically with increased scientific knowledge to argue that performance variations do not indicate the introduction of bias into algorithms.
Importantly, IBIA states, automated facial recognition is more accurate and less biased than the people it replaces, and also enables processes people are not capable of, such as identifying missing children who cannot say their names. The technology has also proven essential to law enforcement and public safety, which will be negatively affected by banning it, IBIA writes.
As in previous comments on proposed legislation, IBIA emphasizes the difference between facial recognition and surveillance, characterizing the former as a passive process and the latter as an active one.
The suggestion of gathering information about people’s characteristics from facial analysis is singled out as particularly reflective of poor understanding of facial biometrics.
“Facial recognition algorithms as a source of information about an individual’s characteristics is not science. One cannot infer emotion, patriotism, criminal inclinations, sexual orientation, or other characteristics from a mathematical template of the face. This is NOT facial recognition,” IBIA writes (emphasis in original).
“Conflating this with facial recognition only confuses the issues and will certainly preclude an informed discussion on the public safety and security benefits of facial recognition technology.”
Algorithms developed to identify characteristics like criminality and sexual orientation have been consistently panned by biometrics researchers as junk science.
The IBIA recommends the ordinances not be enacted in their current form, and that governments and businesses be allowed to use facial recognition for its wide variety of benefits.