Blanket bans on face biometrics use by private sector unhelpful, digital privacy group argues
Robust regulation can ensure private sector use of facial recognition that balances people’s freedom to use biometric technology for phone unlocking and other applications with its risks, The Electronic Frontier Foundation (EFF) writes in an article explaining the group’s position.
While facial recognition use by public or private sector groups alike can exacerbate injustice, according to the EFF, informed opt-in mandates and data minimization, with Illinois-style enforcement through a private right of action, can protect people from potential harms while enabling the technology’s use.
The post points out the risks biometric data can pose through their permanence, through applications that chill fundamental freedoms, and through amplification of racial disparities.
In light of these issues, and in contrast with its position on biometrics use by private entities, EFF supports a ban on government use of facial recognition, Senior Staff Attorney Adam Schwartz and Associate Director of Community Organizing Nathan Sheard write. The EFF has supported numerous local bans and California’s three-year moratorium on the use of biometrics in police body cameras, and launched its About Face campaign in 2019.
The article considers the promise and peril of emerging and digital technologies in general, but notes that “It does not follow that all private use of face recognition technology undermines human rights.”
The EFF claims that passwords provide stronger security for mobile devices than facial recognition (neglecting to include the qualifier “strong” before “passwords”), and points out, sometimes with questionable logic, that a ban on private use could prevent the kind of demonstrations that have sought to highlight problems with the technology.
Laws that should be enacted to protect people from misuse of face biometrics in the private sector include BIPA-style informed consent requirements and data sharing restrictions, retention limits, biometric data reselling bans, and secure storage rules, the group argues.
Where the private and public sector overlap, the EFF argues government entities should be barred from contracting for private facial recognition capabilities. EFF successfully urged that the facial recognition restrictions recently enacted in Boston be amended to allow private actors to use the technology in public spaces, such as for demonstrations in public parks.
The organization also notes that U.S. vendors have sold surveillance technologies to authoritarian governments in the past, and urges a ‘know your customer’ (KYC) standard to work alongside the Foreign Corrupt Practices Act and existing export regulations to prevent overseas abuse of face biometrics.
“The democratization of facial recognition”
Facial recognition is currently being used by private individuals to provide information to law enforcement about people wanted for federal crimes, meanwhile.
Both the potential risks of and usefulness of facial recognition can be seen from the use of the technology by private individuals to analyze videos of the U.S. Capitol riot shared on the social media platform Parler, as reported by Vice.
One “technologist” told Vice that he or she has processed roughly 40,000 facial images from 900 videos taken from the platform before its hosting contract was terminated for repeated policy violations. Data from a spreadsheet built by the individual, including a unique identifier for each person identified in the videos along with a timestamp and location for each associated image, has been shared with the FBI.
Vice refers to the efforts by members of the public as “the democratization of facial recognition.”
A website called ‘Faces of the Riot’ has also been created with images purportedly from the Parler videos. That site does not use facial recognition, however, only facial detection, according to one of its co-creators.