Unclear if biometric privacy law applies to algorithm training, though BIPA decisions clarify scope
Biometric privacy laws may apply only to processes in which individuals are identified, Theodore F. Claypoole of Womble Bond Dickinson (U.S.) LLP writes for The National Law Review, which could sink a series of recent lawsuits filed under Illinois’ Biometric Information Privacy Act (BIPA).
The recently filed round of suits targets tech giants and, according to Infosecurity, also facial recognition provider FaceFirst. A company representative confirmed the suit in an email to Biometric Update, but declined to comment.
FaceFirst has avoided previous legal entanglements that have ensnared some biometrics companies, as its technology operates on an opt-in basis. The company is also alleged to have used IBM’s Diversity in Faces training dataset to improve the performance of its algorithm for different demographics.
Claypoole points out that contrary to common opinion, there are no rules about how private companies can use the likeness of an individual. Given the definition of biometrics in BIPA, and the specific exclusion of photographs as “biometric identifiers” or “biometric information,” using a photo to train an algorithm on how to distinguish between facial features without identifying the individual pictured may be interpreted as not a biometric process, and therefore not eligible for BIPA protections.
Training AI systems is an area of both law and ethics which remains largely unexplored, Claypoole argues, and privacy concerns seem to be eliminated when a picture is neither associated with an individual nor reproduced for public viewing. Plaintiffs in the case request that the defendants destroy any relevant facial data they have stored, but as Claypoole writes, it is not clear how this could be carried out, as the dataset is not owned by the defendants, and there is no known way to determine what the system has gained from any particular image.
“How much privacy is at stake here?” Claypoole asks. “And will the defendant companies settle the matter, or have the stomach to stick it out and make an important distinction in the law?”
The attorney has also recently written for the Review that the U.S. warrant system can provide the necessary protections against potential harms from facial recognition.
Decisions reveal scope and effectiveness of strategies
The scope of BIPA has been clarified in ways variously beneficial to plaintiffs and defendants, attorneys with Eversheds Sutherland LLP write for ALM’s Cybersecurity Law & Strategy.
An Appeals Court decision in Bryant v. Compass Group USA, Inc. finding that the plaintiff suffered an injury-in-fact, and therefore has standing in federal court, establishes a federal precedent for standing. According to the Seventh Circuit decision, the violation is equivalent to an act of trespass, and the withholding of information by the defendant creates an informational injury.
Recent decisions have established limits to BIPA’s jurisdiction over biometric technology suppliers based out of state with little business in Illinois, but also its applicability to a company that sold thousands of biometric time and attendance tracking devices in the state. In Vo v. VSP Retail Dev. Holding, Inc., the court ruled that a service for virtually “trying on” eyeglasses is covered by BIPA’s healthcare exemption, as the service involves prescription eyewear and replaces a process that would typically be performed by an eye care professional.
Multiple suits have also shown that federal employment laws can preempt BIPA claims.
A suit filed against Loews Chicago Hotel in 2018 by an ex-employee for alleged violations with a biometric time and attendance tracking system has reached a $1.05 million preliminary settlement, Law360 reports. Plaintiff Tekita Bryant has asked a District Court judge for preliminary approval of the deal, and for conditional certification of a class consisting of more than 1,200 Loews employees.