With biometric lawsuits increasing — what regulations can protect our privacy?

By Rob Shavell, Co-founder & CEO of Abine / DeleteMe

Biometric lawsuits are happening everywhere. From fast-food chains and pork producers to logistics companies and tech giants like Google, countless companies are currently facing legal action over how they collect, store, and use customer and employee biometric information. In January 2021 alone, plaintiffs filed as many as 55 such lawsuits. But while public interest in biometric data privacy is undoubtedly increasing, the ongoing flurry of biometric legal action also highlights a worrying fact: current legislation is not protecting consumer biometric information.

Biometric data is becoming a “normal” part of authentication

While it was once a novelty, biometric authentication no longer surprises us. Hundreds of millions of people now use biometrics daily to unlock their devices or sign into workplaces without giving the technology a second thought. However, as our acceptance of biometric scanning grows, tech companies are finding new uses for the technology behind it and are dramatically raising the potential for its abuse.

Amazon’s ‘Amazon One’ provides a pertinent example. Introduced last year into Amazon Go stores, Amazon One is a biometric palm print scanner that allows people to pay for items by using their palm prints.

In what may be a significant negative development for individual privacy, this technology allows Amazon to link customers’ biometric data to their accounts. In time, this new trove of data will inevitably allow Amazon to personalize its offerings even further. Frederike Kaltheuner, a tech policy analyst, agrees with this point, describing Amazon One as a method for the tech giant to “fill in the gaps in its data empire” rather than the customer benefit it claims to be.

Amazon is not the only company harvesting customer biometric information. Earlier this year, TikTok updated its privacy policy to let its users know that it’ll now be collecting their “voiceprints” and “faceprints.” Disturbingly, there’s no mention of what either term means or what the company will do with the biometric information it collects. And even though TikTok says that, “where required by law,” it will seek user consent before collecting this information, the question we need to ask is: which law?

The current legal landscape

Despite the growing threat that biometric data abuse poses to consumers, there is currently no single comprehensive federal law governing the collection and use of biometric data in the U.S. That is not to say, however, that federal lawmakers are uninterested in passing one.

Introduced last year, the National Biometric Information Privacy Act of 2020 (NBIPA) aims to regulate biometric data collection, disclosure, retention, and destruction. If passed, the law would require private companies to gain individuals’ consent before collecting their biometrics, like their eye scans, faceprints, voice scans, and fingerprints. Crucially, the law would also allow a private right of action.

The proposed NBIPA closely mirrors Illinois’ Biometric Information Privacy Act (BIPA). Passed in 2008, BIPA remains one of the strongest privacy laws in the country and has facilitated countless privacy lawsuits. Under this law, individuals can bring a private right of action even if they haven’t suffered direct injury or harm from a company violating BIPA’s requirements. If it hadn’t been for BIPA, Facebook’s face photo-tagging feature would never have been questioned. Cited against Facebook, BIPA led directly to a $650 million class-action settlement and, in the words of U.S. District Court Judge James Donato, a “major win for consumers in the hotly contested area of digital privacy.”

Since BIPA was enacted, other states, including California, Texas, Washington, New York, and Arkansas, have passed their own biometrics laws. Unfortunately, most are too lenient, and the majority don’t allow individuals to take a private right of action. For example, New York City’s Biometric Identifier Information Law lets businesses collect, use, and retain customers’ biometric data as long as they are notified in “plain, simple language.” A number of similar efforts across the US have been derailed thanks to aggressive lobbying from tech companies.

But even BIPA, while a good start, may soon not be enough to protect consumers. Biometrics are too broad and fast-moving a category for such a specific piece of legislation to cover comprehensively. In this regard, the CCPA, which has a broader definition of what constitutes “biometric data,” may actually offer more protection to consumers than BIPA, whose definition is much narrower.

Any law needs broad coverage and a private right to action

Right now, only a handful of states have biometric legislation in place, and each law differs vastly in how it defines “biometric information” as well as how entities can collect, use, and retain this data. Importantly, only two states (Illinois and California) let individuals confront biometric data abusers without the Attorney General initiating action on their behalf.

Going forward, we need more states to pass biometric laws that allow people to take a private right of action. And, with biometrics evolving at a breakneck pace, we also need these laws to contain relatively broad definitions of what “biometric data” is. Overly narrow biometric laws risk becoming outdated and may end up giving tech companies loopholes to collect and potentially misuse biometric user data further.

The NBIPA is a great example of what any federal or indeed local law should look like. Firstly, it provides a private right of action. However, just as critically, and unlike BIPA, it also defines biometric information more broadly. Under NBIPA, a “biometric identifier” can be a retina or an iris scan, a faceprint, a voiceprint, fingerprint/palmprint, and, significantly, “any other uniquely identifying information based on the characteristics of an individual’s gait or other immutable characteristic of an individual.” In this way, NBIPA includes the possibility of additional identifying features.

Even if NBIPA doesn’t become law, it should become the blueprint for states looking to protect their citizens from private companies’ misuse of biometric identification. As for any other federal privacy law that may be proposed in the future, it needs to build upon, not detract from, the stipulations within NBIPA.

About the author

Rob Shavell is CEO of Abine / DeleteMe, The Online Privacy Company. Rob has been quoted as a privacy expert in the Wall Street Journal, New York Times, The Telegraph, NPR, ABC, NBC, and Fox. Rob is a vocal proponent of privacy legislation reform, including the California Privacy Rights Act (CPRA).

DISCLAIMER: Biometric Update’s Industry Insights are submitted content. The views expressed in this post are that of the author, and don’t necessarily reflect the views of Biometric Update.
