
Biometrics industry lobbies for responsible facial recognition regulation

Experts from NEC, FaceFirst, IBIA, Identity Strategy Partners, DHS on countering misinformation and addressing concerns
 


What constitutes appropriate use of biometric facial recognition technology is the subject of Congressional hearings, draft legislation at the federal and state levels, and debates and partial moratoriums in cities across the U.S. New rules are being considered in the EU and elsewhere, and the biometrics industry is increasingly locked into a debate that all agree is critical.

Beyond that, the terms of the debate and the facts of the matter often seem at issue.

The stakes are high, because legislation at the local and state levels has already complicated the market for businesses, and the civil liberties claimed to be under threat are foundational to the U.S. and many other nations. Legislation, therefore, is coming.

Perpetual state surveillance in service of repression is a fact of life for some people, and has itself prompted government action. Fear that it will spread as part of a system that undermines personal freedoms and civil rights is being stoked by some rights advocacy groups as part of a campaign against biometrics in general, and facial recognition in particular.

Privacy advocates have been very effective at crafting that narrative and raising scenarios that violate probable cause and other existing legal principles and statutes, International Biometrics and Identity Association (IBIA) Executive Director Tovah LaDier told Biometric Update in an exclusive interview.

While a number of privacy advocacy organizations have not focused on fighting facial recognition, the American Civil Liberties Union (ACLU), Fight for the Future (FFTF) and others have repeatedly argued that the technology is inaccurate, biased, and invasive. They also claim this technology, despite (and even in part due to) not working well, will power the rise of dystopia, invoking George Orwell’s fearsome creation by name. They do not, however, tend to differentiate between verification and identification, forensic and live use, or new and existing processes.

“Despite the fact that you’ve got a smaller segment of special interests that are anti-face recognition, they’re very good at lobbying and what they do with misconceptions and throwing those out there effectively,” Identity Strategy Partners Co-founder and CEO Janice Kephart explains. Kephart served as 9/11 Commission counsel, playing a key role in the portion of the report on leveraging biometrics, and has performed many public and private sector roles related to complex identity issues.

Facial recognition is not used in the surveillance apparatus described by Orwell in 1984. It does not feature in the dystopian futures presented in Huxley’s Brave New World, Anthony Burgess’ direct response to Orwell, 1985, Zamyatin’s We, Atwood’s The Handmaid’s Tale, or indeed hardly anywhere else in the canon of the genre. It rarely even appears in the work of Philip K. Dick, whose Minority Report is the other example often cited in consumer publications, and who made technology a focus of his stories of future oppression. The overbearing, rights-crushing surveillance systems described in these works of fiction are no less effective for being, in the main, entirely compatible with the facial recognition regulation proposed by prominent rights advocates.

Several executives of biometrics companies have told Biometric Update that smartphones, and the ability they grant service providers to track the location and activities of individuals, represent a threat to civil liberties that has already been ubiquitously deployed. This message, however, has yet to be widely received.

Instead, legislators in different jurisdictions are taking a variety of approaches toward a mixed bag of results. How it will all end up is an open question at the moment, one of critical importance to all stakeholders.

“Facial recognition is having a moment. It is maturing. It has matured dramatically in the past five years,” NEC Vice President of Washington D.C. Operations and Federal Business Benji Hutchinson tells Biometric Update. “Along with that comes a little bit of a latency in the public policy area because laws are notoriously slower than technology to be adopted and put in place. We’re seeing that vigorous debate happen right before our eyes.”

The industry is now attempting to find its voice and coordinate, but the process is at an early stage, and so far, that voice has been weak, according to Kephart.

What is the message?

Facial recognition policy will eventually have to deal with technologies, use cases and situations which do not involve the same concerns. Real-time CCTV deployments by law enforcement are just one in a galaxy of ways organizations are currently working to leverage the technology, most of which are both less controversial and closer to fruition. Facial biometrics are about to disrupt several different sectors of the economy, FaceFirst CEO Peter Trepp predicts in an interview.

“In a few years, that’s going to be the norm. Credit cards are going away, and ATM cards are going away, and lines are going away,” Trepp says. “We can board a plane now in half the time using facial recognition; guess what, people don’t want to stand in line, of course they’re going to do that. There are going to be questions. Amazon Go is coming out, three thousand stores and no cashiers, no checkout lines, and every retailer that we’re dealing with recognizes that we don’t want to have lineups. There’s a convenience to customers but also customers see companies like that, as taking care of their customers.”

What this means from a practical standpoint is that ‘facial recognition’ refers to a basket of technologies, including 1-to-1 facial verification and 1-to-N facial identification, implemented in a wide variety of different ways. These different kinds of implementations do not share the same benefits and risks. A key element of the message the industry must get across to policy-makers and the public, therefore, is that people mean different things when they use the term “facial recognition,” but the version its most strident opponents describe is not one of them.
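
To make the distinction concrete, the sketch below (in Python, with hypothetical function names and threshold values, not any vendor's implementation) contrasts the two operations, assuming face images have already been reduced to fixed-length embedding vectors by a recognition model:

```python
# Minimal illustration of 1-to-1 verification vs. 1-to-N identification.
# Hypothetical sketch: names, thresholds and the embedding step are
# assumptions for illustration, not any particular vendor's API.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 verification: does the probe match the single claimed identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6) -> list:
    """1:N identification: search an entire gallery and return ranked candidates."""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    return sorted(
        ((name, score) for name, score in scores.items() if score >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )
```

Verification answers a yes/no question about one claimed identity, as in unlocking a phone or confirming a traveller against their passport photo; identification searches an entire gallery, which is where most of the surveillance concerns arise.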

The actual unaddressed policy issues are largely around this kind of private sector implementation. Public sector deployments of facial recognition often receive more media attention, but are already subject to significant rules, Department of Homeland Security Deputy Executive Assistant Commissioner John Wagner insists in an interview.

He makes a comparison to radar guns, which require calibration and officer training, though Wagner also acknowledges certain grey areas. He believes the difference is not sufficiently understood by policy makers: deployments of the technology in public spaces are significantly different from what Customs and Border Protection is doing with Biometric Entry/Exit.

“It’s unclear I think, and the courts or Congress probably need to rule on this, what that constitutes as far as if a law enforcement officer stops you on the basis of a match. What level of suspicion does that rise to or equate to, and can that cause a law enforcement response?” he asks. “When you talk about policy those are all the things that still need to be discussed.”

For CBP, facial biometrics are used in the Biometric Entry/Exit program to replace a manual process that has been mandatory for decades with an automated one, and therefore the agency already has the authority, as well as rules in place, to perform the process.

CBP and other government agencies likewise have processes and controls in place that are unknown to most people, and likely more stringent than many would realize.

“There’s no shortage of requirements that we have to publish; Privacy Impact Assessments, Systems of Record Notices,” Wagner points out. “We have to be clear with people about the data we’re using, keeping, storing, for how long, on what conditions. All those things have been published. We’re working on regulations to further that description of what we’re doing.”

Trepp also notes that existing laws, including newly imposed restrictions and recent proposals, are not well understood by the media and public. The “ban” in San Francisco, for example, has no direct bearing on most uses of facial recognition technology at all. The regulation there is a temporary measure that applies only to the use of the technology by law enforcement agencies, and only municipal ones at that.

This discrepancy may be more crucial than the industry has recognized thus far, as it could convey (and is certainly often used to suggest) the false impression that the authorities who have most closely studied the matter have tended to approve blanket bans.

Another key message is that facial recognition is treated differently from fingerprints and DNA by the legal system. Forensic facial recognition matches are probabilistic rather than deterministic and are passed to a human examiner to make the judgement; the results are also not admissible as evidence in court.
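
As a rough illustration of that workflow (a hypothetical sketch, not any agency's system), a forensic search produces a scored shortlist of investigative leads for an examiner rather than a binary determination:

```python
# Hypothetical sketch of a forensic candidate list: the system returns
# scored leads for human review; it does not declare a definitive match.
from dataclasses import dataclass

@dataclass
class Candidate:
    subject_id: str
    score: float  # similarity score; probabilistic, not proof of identity

def forensic_candidates(scores: dict, review_threshold: float = 0.5,
                        max_candidates: int = 10) -> list:
    """Return a ranked shortlist of leads for examiner adjudication."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    leads = [Candidate(sid, s) for sid, s in ranked if s >= review_threshold]
    return leads[:max_candidates]  # every lead still requires human review
```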

Proponents of blanket bans on facial recognition have repeatedly claimed that the technology is inaccurate.

“What they have done is used a lot of misleading information, which has been publicly debunked, for instance by NIST, but they continue to use it and it continues to be believed by a certain number of lawmakers,” explains LaDier.

“But it’s not only the inaccurate facts, it’s that they have been masterful in creating a perspective, that somehow facial recognition is really dangerous in all these hypothetical situations in the future if we start down the road of using it, so we have to ban it now,” she continues. “The common refrain that you hear is that we’re going to become like China. What they ignore is the fact that it’s not the existence of the technology, it’s what are the values of the country and the people, and how do they decide they want to live.”

LaDier also notes that the ACLU has conflated terminology in its description of facial biometrics, and continues to use the terms in the same way. Along the way, terminology has become an even broader point of contention for the biometrics industry, adding emotional weight to allegations about low accuracy for women and people with dark skin.

“One of my goals has been to change the terminology,” LaDier explains. “But everybody keeps using the term ‘bias’ because it’s short, rather than saying the whole phrase.”

What people are talking about is really performance across different populations, she says. It could also be called demographic disparity, but however it is labeled, it is nearly eliminated by several leading vendors’ algorithms in NIST testing on the subject. Since the report was released, LaDier has noticed a shift towards more scientific language among some publications.
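
As a simple illustration of what that measurement looks like (a sketch with hypothetical group labels and data, loosely in the spirit of NIST's demographic testing rather than its actual methodology), disparity can be expressed as the ratio between the best- and worst-performing groups' false match rates:

```python
# Hypothetical sketch of measuring performance across populations:
# compute a false match rate per demographic group and compare groups.
def false_match_rate(trials: list) -> float:
    """trials: list of (is_impostor_pair, system_declared_match) booleans."""
    impostor_outcomes = [matched for impostor, matched in trials if impostor]
    return sum(impostor_outcomes) / len(impostor_outcomes) if impostor_outcomes else 0.0

def demographic_disparity(trials_by_group: dict) -> tuple:
    """Return each group's false match rate and the max/min ratio across groups."""
    fmr = {group: false_match_rate(t) for group, t in trials_by_group.items()}
    nonzero = [rate for rate in fmr.values() if rate > 0]
    ratio = max(nonzero) / min(nonzero) if len(nonzero) > 1 else 1.0
    return fmr, ratio
```

A ratio close to 1.0 under this kind of measure would indicate near-identical performance across groups, which is the sense in which the disparity is described as nearly eliminated for leading algorithms.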

A task for the industry, then, is to force vendors in the market to meet acceptable performance standards for their chosen area of application, and make clear to policy-makers and the public that a problem among legacy and some of the non-commercial algorithms tested by NIST does not apply to the technology used by governments or businesses.

Technical and ethical standards, best-practices and ultimately government regulation can all play a role in assuring anxious members and representatives of the public. This means not only that the industry needs to continue working with other stakeholders to formulate guidelines that are acceptable to society and reflect its values, but also that those frameworks already in use should be communicated to those who have concerns, and are willing to be educated.

The disconnect between the “wild west” environment portrayed by some rights activists and the tightly controlled reality for DHS also reflects what LaDier describes as an inherent challenge for biometric technology proponents, which is that their messages are often much more technical in nature, and therefore can lack the emotional impact of simple appeals to fear based on misunderstanding. This challenge applies to how applications work and how they are controlled, and to an even greater extent to messaging about effectiveness and accuracy.

Demographic disparities or “bias” problems are ultimately accuracy problems, and the drastic improvement in the accuracy of algorithms is in large part due to advances in machine learning, but also in part due to improvements in datasets. IBM released its “Diversity in Faces” dataset specifically to address potential “blind spots” in algorithms, but a subsequent controversy over its methods of building the dataset highlights the difficulty of efficiently closing the remaining gap in error rates. Further, new research indicates that improving datasets may never make facial recognition equally accurate for different genders.

Now that NIST has quantified the extent of demographic disparity, the industry can point to the small number of people who will be affected by the remaining disparities, and to what a failure to match their biometrics will actually mean, as evidence that for existing use cases, the technologies used by the U.S. government are more accurate than any alternative on offer.

The message that the industry needs to get across to the public and policy-makers in the short term, then, is that facial recognition means different things, it works, and standards are important.

How to get the message out

Some facial recognition providers have been communicating with policy makers for years, but not many, and not in a coordinated fashion. Messaging by the industry to the public has been extremely limited, though over the last couple of years a few thought leaders have been interviewed by consumer publications, and the appetite among consumers and the media that serves them for information on biometrics has never been greater.

The receptivity of the audience to any message matters a great deal, and receptivity appears to be significant among the public, with 56 percent of U.S. adults in support of police use of facial biometrics, and nearly half “excited” to use it in retail settings. Policy makers are also largely receptive, according to NEC Director of Government Relations Brent Bombach.

LaDier says that some in government have a relatively high degree of understanding of facial recognition’s benefits and risks, such as members of the House Homeland Security Committee, who have been working with DHS on the technology’s implementation since before it burst out into popular debate.

Most lawmakers, at least at the federal level, are sincerely attempting to learn more in order to make properly informed judgements, Bombach asserts.

“We’ve found it extremely helpful just to be present to start answering questions,” he says. “There’s quite an appetite for understanding the technology out there, and what we do is spend our time on Capitol Hill and with others here in DC educating them about the true state of the technology and its applications, and in many cases directing them to others who are even better resources, like NIST and others on the government payroll who are true experts in biometric technology.”

Despite a few bills being proposed, Congressional action does not seem imminent to Hutchinson. The federal level is where most efforts, including those from groups like IBIA, are focused, however, and where many stakeholders on both sides are looking for a single set of rules.

Meanwhile, some states have been moving faster. Washington State recently took a step towards passing a data privacy bill with major implications for facial recognition companies, and proposals were considered in at least 26 states last year, making it necessary for the industry to split its engagement with policy-makers between two levels.

“Many of our major companies are very active in the states, so I’m beginning to feel a consensus developing that we somehow have to find a way to focus more on what’s going on at the state and local level,” LaDier says. “But in the meantime, the real fight for us on Congress versus state and local is we desperately need a federal pre-emption provision for whatever Congress does on biometrics, facial recognition, or big data in general. That federal pre-emption is really critical for us. We can’t have 50 states each having different laws and different rules and regulations, and some have private causes of action and some don’t.”

The uncertainty around state level legislation, and the risk of a patchwork of rules, is emphasized by Hutchinson of NEC.

“The unfortunate thing is that we’ve seen this before,” he says. “We saw this in the 1990s and 2000s when DNA started to roll out across the United States. There was no federal pre-emptive law that was put on the books for standards around how and when that DNA is collected, so the states started implementing their own laws, and now if you look at that, there are a bunch of different laws at the state level that give law enforcement agencies, for example, guidance, and it’s very inconsistent across the country. We suspect that unfortunately that could be the way that facial recognition ends up.”

Federal pre-emption, if it is going to happen, will likely come through data privacy legislation.

It is urgent for supporters of facial biometrics to make the case against overly restrictive legislation at multiple levels of government, but there is only so much companies developing and delivering the technology can do. The need for the government’s own resources and expertise to communicate the benefits of the technology, and the controls around it, is brought up repeatedly by industry stakeholders, who are not themselves able to fully contend with accusations that the technology is not as advertised, since they are the ones paying for any advertising.

Federal resources include people who have the most subject matter expertise, available in-house, according to Bombach.

“We need to be much clearer with the public and with the Hill on exactly what we’re doing, how we’re doing it and why we’re doing it,” Wagner admits. “I think we need to answer all of the questions and the concerns that are out there, and there are various formats we do that in.”

Police have likewise been supportive of facial recognition technology. Trepp points out that the media virtually ignored comments by the Chief of the New York City Police about needing facial recognition to speed up and improve existing processes that would otherwise be done manually. Headlines do not generally mention these kinds of testimonials, leaving most people unaware of them.

Inconsistent implementation of best practices and occasional misuse of the technology, on the other hand, such as the alleged use of police sketches as probe images, threaten to undermine the value of such frameworks. Kephart observes that as the market has opened up, some relatively new entrants seem not to have grasped the importance of adhering to strict standards right away.

Positive impacts are starting to roll in from many of the trials and early uses in the U.S. and around the world, including CBP data showing high accuracy without significant demographic differences, faster boarding processes, and children rescued from abuse by police. Those good-news stories can be shared proactively and repeated to balance the repetition of outdated or out-of-context talking points, and to illustrate the key messages facial recognition proponents want to get across.

Co-ordination and the multi-stakeholder approach

Hutchinson notes recent efforts by the IBIA, IDTA (Identity Technology Association), SIA, and the Chamber of Commerce, all of which he says have really stepped up in the past five months. This includes issuing best practices and white papers, and, LaDier adds, communicating with the media on a consistent basis.

Neither the identity industry nor federal agencies have traditionally had strong voices, according to Kephart. It is in the biometrics community’s best interest to set quality standards and then to conduct testing, and NIST is established in the latter role for the community. The community has mostly focused on those efforts, without taking on others such as raising awareness among policy makers or the public.

“They haven’t filled the void with their own voice because they always let the technology speak for itself.”

In attempting to raise the level of education on the topic, LaDier has found that small groups, or round-tables of a couple dozen people or fewer, allow questions to be asked and answered patiently. What needs to improve, she says, is the industry’s coordination around its key points and the best way to deliver them. Companies working with facial recognition, and biometrics more generally, must take up a constant rhythm, Hutchinson says, and he echoes LaDier on collaboration.

“I think one of the things we have struggled with, but we’re getting better, is working as a coalition,” he states. “I think for a while when the facial recognition topic started to really surge, we may not have responded as quickly as we could have. It’s a big industry, a global industry, so there’s reasons for that, but I think collectively we’re getting better at it now.”

Trepp wrote his book “The New Rules of Consumer Privacy: Building Loyalty with Connected Consumers in the Age of Face Recognition and AI” as a way to organize his thoughts on the matter and move the conversation forward. He says that since FaceFirst was invited to Capitol Hill to consult on draft legislation, the company has remained committed to informing lawmakers.

“We’ve been on call since, and we’ve been invited back, and we hope to continue to have ongoing dialogue.”

Now, federal agencies have realized the urgency of making their argument clear, even amid the other urgent priorities that make up their mandates.

“When it starts taking the turn of potential legislation, that gets everybody’s attention,” Kephart observes. “They don’t care nearly as much up to that point. Someone may be complaining about something, but they’ve got a job to do.”

This is also true of states, and Hutchinson says NEC is increasingly engaged at that level.

The NIST report on demographic disparity represents a major step forward in dealing with the concerns of people just learning about the technology, and Patrick Grother, who leads NIST’s facial recognition testing, has been working to educate the public, LaDier says. A dominant media narrative emerging from the report, however, has been that most algorithms tested were found to have significant disparities. Headlines that could change minds in large numbers have yet to be produced, but the industry can use the data provided by NIST to back up its arguments, in a simple example of the kind of teamwork that can be convincing.

Airport deployments are another example, in which biometrics providers, government partners, and private industry partners from the aviation industry can share their findings together to build a counter-narrative based on actual experience.

The federal government must continue to find its voice, both in terms of sharing application and testing data and communicating about safeguards. The industry must be clear about what facial recognition systems are doing, as well as the standards and best practices already used, given the common belief that these already address the main concerns of the technology’s chief opponents. Compromise may be necessary.

So long as that compromise is made based on informed efforts to protect society against real threats, be they to physical safety or personal liberties, the technology can be used to improve people’s lives, even while the decline of personal privacy is reversed.
