US courts training plaintiffs and defendants on repurposing biometric data
A potential class action alleging biometric data privacy violations has been filed against photo storage site Photobucket. The twist is that unlike the litany of lawsuits under Illinois’ Biometric Information Privacy Act, Photobucket is alleged to have violated statutes in several different states.
Photobucket recently changed its privacy policy so that it could sell the images stored in dormant accounts, and the face and iris biometric data they contain, as training data for generative artificial intelligence (genAI) models, Ars Technica reports.
The story is essentially one of a company with a dwindling business finding a new way to monetize data it has collected.
The complaint cites BIPA, but also similar laws in New York, California and Virginia, along with consumer protection laws in states like Colorado.
Photobucket sent emails to users with dormant accounts, instructing them to consent to the use of their biometric data or face its deletion, which the plaintiffs say does not qualify as informed consent. Plaintiffs call the emails “fraudulent and coercive,” and say they presented links to delete information that instead led to pages for accepting the new terms. Those who ignored the email were considered to have opted in after 45 days, they allege.
More than 100 million users could be eligible for damages. At least one Illinois user has been told by Photobucket that his biometric data may have already been sold.
The companies that bought the data, though as yet unnamed, are also targeted by the lawsuit.
Repurposing images that contain biometric data has proven thorny legal ground, but the courts are recognizing a difference between doing so by selling them and by using them for algorithm training.
Diversity in rulings for and against Google
Google has partially succeeded in getting claims of biometric data privacy violations against it due to its use of IBM’s Diversity in Faces dataset thrown out.
A California federal judge struck down plaintiffs’ claim under BIPA Section 15(c) and their claim for unjust enrichment relief in the form of “restitution and disgorgement,” meaning money. The motion to dismiss the unjust enrichment claim seeking injunctive relief, however, was denied, along with the motion to dismiss the claim under BIPA Section 15(b).
Section 15(b) forbids the collection of biometric data without informed written consent. Section 15(c) stipulates that businesses cannot sell or “otherwise profit from” sharing or disseminating people’s biometrics.
Google argued that, as ruled in a similar case against Microsoft, the way it allegedly profited from the data, in this case by using it to train an algorithm, is not covered under 15(c), and that the collection of biometric data did not occur in Illinois, so BIPA does not apply. Further, no evidence was presented that plaintiffs’ Pixel phones were improved with the data in question.
District Judge Beth Labson Freeman ruled (via Law360) that the alleged violations could be inferred to have occurred “primarily and substantially” in Illinois, keeping the 15(b) claim alive, and that plaintiffs can pursue an injunction to prevent future harms on grounds of unjust enrichment.
Plaintiff Tim Janecyk has been pursuing damages related to the Diversity in Faces dataset since the beginning of 2020, when he filed suit against IBM. He was joined that year by Steven Vance, the other named plaintiff in the suit against Google.