Facial biometrics loophole highlighted by World Privacy Forum report on U.S. schools
One of the top biometrics news stories of the young 2020 calendar year pops up in the middle of the World Privacy Forum's new report on privacy in U.S. schools, but it is not the sudden mass adoption of digital distance learning during COVID-19 lockdowns. Instead, Clearview AI makes an appearance via the Vermont Data Broker Registry, illustrating a massive loophole: the "directory information" designation, which could be exposing images of millions of children to biometric processing and use in training algorithms.
World Privacy Forum Founder and Executive Director Pam Dixon told Biometric Update in an interview that the group spotted Clearview in the registry on January 20, as the one data broker listed there that makes clear it intends to commercially use any images it finds. How many more companies are doing something similar but simply do not appear in the Vermont registry is unclear, she says. Somehow, students' photos have been winding up in open testing databases, such as MegaFace.
Biometrics use in schools, such as for taking attendance, is regulated by the Family Educational Rights and Privacy Act (FERPA) and requires no consent, but the data can never be used for any other purpose. FERPA requires schools to hold and manage the data, and also prevents them from sharing the information, except in rare circumstances, or using it for a reason other than its intended purpose.
The problem comes in, according to the report "Without Consent," when that data is designated as directory information, which takes it outside the scope of all privacy law and allows third parties to use it.
“Those images are just in the wild, and no consent is required at all,” according to Dixon.
Many schools have contracts with yearbook companies that use biometrics, sometimes through contracts with consent requirements that put strict limits on how the data is used. Sometimes, however, it is because the school photos have been designated directory information.
WPF had to set aside some major areas (such as digital educational platforms) to complete the report, but it does contain a chapter on "Student Biometric Data and FERPA Directory Information." The issue has become much more prominent over the four years the group worked on the report.
Earlier this year, Dixon notes, a large number of privacy groups signed a letter saying schools should not use biometrics.
“The thing is, that’s well intentioned, but the truth is that there are already a lot of laws that regulate biometrics use in schools,” she points out. A selection of them is listed at the end of the report’s biometrics section.
“But no-one really looks at the third-party problem,” Dixon warns. “That’s where I think the real problems are coming in; on social media, on the school websites that are just completely left without any scraping protections.”
It is also a problem that would not be solved by the groups’ proposal.
“Let’s say a school responds to privacy groups’ requests and doesn’t collect biometrics, if they still designate student photographs as directory information, we still have a problem!”
The report identifies a range of other problems with student privacy. FERPA policies are not even made available on many school websites, a problem even more pronounced among rural schools (only 11 percent post them) than urban ones (a still-meagre 39 percent). People like domestic abuse survivors may need that information to know where they are safe from being found, but meanwhile schools innocently publish photos of students, sometimes lauding their achievements, sometimes pictured in front of their own homes, and sometimes reposted to social media to be made widely available.
“There’s just not a good understanding of some of the baseline safety principles around kids’ photos,” Dixon observes.
Dixon also acknowledges the drive to improve the accuracy of biometric systems to identify children, but says that society needs to draw the line somewhere. “We need to find an ethical, safe and healthy way of making those biometrics better without resorting to what Clearview AI has done,” she argues.
Dixon contributed an article to the ID4Africa 2020 Almanac on “Africa’s Rising Leadership in Privacy,” which was recently published as part of the movement’s preprint series.