Digital rights group warns against normalizing and committing to facial biometrics to stem pandemic
The facial biometrics capabilities and infrastructure being built with massive investment to screen individuals and stop the spread of the coronavirus cannot be easily dismantled when the crisis ends, and risk establishing a pervasive surveillance apparatus, the Electronic Frontier Foundation (EFF) warns in a new article.
The EFF warns that the race to implement new biometric technologies could lead government agencies to team up with “some of the most nefarious surveillance technology vendors in the world,” citing as an example Clearview AI, which has reportedly been in talks with state agencies.
The digital rights group notes that it and other organizations have called for a halt to government use of facial recognition, and that CBP is promoting facial recognition as a more hygienic way to pass through airport checks. The EFF argues that the “cameras, software, and open-ended contracts with vendors” the technology requires are not easy to dismantle, and warns against its normalization.
Facial recognition is “a deeply flawed technology,” according to the EFF, which repeats complaints that it enables constant government surveillance, could inhibit free speech and movement, and can be inaccurate. The organization also notes that the technology can now identify people wearing respiratory masks.
“It is all too likely that any new use of face surveillance to contain COVID-19 would long outlive the public health emergency. In a year, systems that were put in place to track infected individuals as they moved through a city could be re-deployed to track people as they walk away from a political demonstration or their immigration attorney’s office,” the EFF cautions.
The EFF recently put out a quiz for Americans and visitors to the country to see who holds their facial biometrics, and the organization is calling for tattoo recognition data to be deleted by the organizations that hold it.
The EFF must recognize the material difference between facial recognition and face authentication to avoid dangerous confusion. Face authentication, designed specifically to safely verify an individual to an account, is 1:1 verification, not 1:N identification. In advanced AI-driven systems with Certified Liveness Detection, the image data is rendered irrelevant: it is converted into a 256-bit encrypted data file (a FaceMap), and the liveness check data, half of what is required to access an account, is immediately deleted. Any subsequent access attempt, for example after a hack, will simply not be allowed. Failing to make this distinction could significantly inhibit adoption of exactly the access management technology we need right now to help keep millions of people safely connected.
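The structural difference the comment draws can be made concrete. The sketch below is purely illustrative and assumes nothing about any vendor's actual system: templates are plain float vectors compared by cosine similarity, and the function names, gallery structure, and threshold are invented for the example. The point is only the shape of the two operations: 1:N identification searches every enrolled template, while 1:1 verification touches the single template the user has claimed.

```python
# Illustrative sketch (not any vendor's implementation) of the difference
# between 1:N facial recognition and 1:1 face authentication.
# Real systems use encrypted, proprietary template formats.
from math import sqrt

THRESHOLD = 0.9  # hypothetical similarity cutoff


def similarity(a, b):
    """Cosine similarity between two template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def identify_1_to_N(probe, gallery):
    """Facial recognition: search EVERY enrolled template for a match.
    Requires a central gallery of faceprints -- the surveillance risk
    the EFF describes."""
    return [person for person, template in gallery.items()
            if similarity(probe, template) >= THRESHOLD]


def verify_1_to_1(probe, claimed_template):
    """Face authentication: compare the probe against the ONE template
    the user has claimed. No gallery search takes place."""
    return similarity(probe, claimed_template) >= THRESHOLD
```

In the 1:N path the cost and the privacy exposure grow with the size of the gallery; in the 1:1 path only a single record, invoked explicitly by the account holder, is ever consulted.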