Age verification laws in the UK and US reassessed
While lawmakers around the world remind us to think of the children, digital identity watchdogs and citizen groups say restrictive regulations around age verification put a chill on internet access, enable mass surveillance – and are ineffective.
ICO issues updates to stance on age assurance in the UK
In the UK, the Information Commissioner’s Office (ICO) has updated its 2021 Opinion on age assurance. A news release says the refreshed Opinion offers guidelines for services that are “likely to be accessed by children” under the Children’s code, explains how the ICO expects online services to apply age assurance measures that use the data of minors, and provides support on navigating legislative changes and compliance with the Online Safety Act 2023.
The update, which incorporates input from expert focus groups, research, voluntary audits and consultation with Ofcom, accounts for the rapid legal, technological and social developments since the Opinion’s original publication.
For biometrics firms, the terms remain largely unchanged, in that biometric data is classified as “special category data” under the UK GDPR, deemed worthy of extra data protection measures. The Opinion does, however, differentiate between biometric recognition technologies, which “process biometric data for the purpose of unique identification,” and age estimation tools that “may use biometrics for face or voice analysis and classification to provide an estimate of a person’s age.” The ICO says the former – “whenever you use biometric data for the purpose of uniquely identifying someone” – is always “special category data.”
Furthermore, reads the Opinion, “before processing special category biometric data, or if the solution you are using is AI-driven, you must complete a DPIA (Data Protection Impact Assessment). This documents your purpose for processing this information, and assesses and manages any risks which may arise.”
Finally, it states that, “to process special category biometric data, you must identify a valid Article 9 condition for processing.” Article 9 refers to the section of the GDPR that defines special category data; its conditions for processing such data include requirements for explicit consent, necessity and legal context. Of special note is Article 9(2)(g), which allows data processing that is “necessary for reasons of substantial public interest.” Per the Opinion, “Assuming it is proportionate for your service to use biometric data for age assurance, then it is likely that you can apply the condition for substantial public interest.”
Biometrics firms have pushed back on previous ICO Opinions, citing technical differences between systems that measure biometric facial geometry, and those (such as Yoti’s) that do not identify the user in initial or subsequent submissions.
The ICO also recently published draft guidance on biometric data, which covers how the UK GDPR applies to the use of biometric recognition systems.
New US age verification rules prompt lawsuits, disdain from rights groups
Age restrictions on online platforms are a hotter potato in the U.S., where rights and freedoms figure more prominently in national identity than they do in monarch-bound Britain. In New York, Governor Kathy Hochul is taking heat for endorsing proposed state legislation that would require social media users to verify their age. Introduced last fall by New York Attorney General Letitia James, the proposed legislation combines two bills: the New York Child Data Protection Act, an update to an older law, which would prohibit social media companies from collecting and selling the personal information of minors for advertising purposes without their express informed consent; and the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, which would rein in addictive tech features such as algorithmically curated feeds.
In a release, privacy and civil rights group the Surveillance Technology Oversight Project (S.T.O.P.) says that “invasive and discriminatory technologies” for age verification are a non-starter, in that they do not work as advertised and cause additional harm by limiting access.
“There simply is no technology that can prove New Yorkers’ ages without undermining their privacy,” says S.T.O.P. Executive Director Albert Fox Cahn. “Many of the measures required in other jurisdictions are child’s play to break, while making the internet inaccessible for seniors, immigrants, and low-income communities.”
Across the country in Utah, opponents of social media age verification laws are taking their case to court. Law360 reports that a group of Utah residents, together with the Foundation for Individual Rights and Expression (FIRE), is suing state officials over the Utah Social Media Regulation Act, which requires minors to obtain permission from parents or guardians to use social media. The plaintiffs claim that the law, passed last year and set to take effect in March, violates the First and Fourteenth Amendments.
A statement from the plaintiffs calls the law well-intentioned but misguided, and says there is clear overreach in applying age verification rules to everyone when they are only intended to protect children. “The law subjects all Utahns to intrusive and imperfect age-verification mandates before they can access vast numbers of interactive services that permit the sharing of expression, compromising their privacy and chilling speech,” it says, comparing gated age verification to keeping “all books under lock and key because some are inappropriate for some children.”
In December 2023, the Federal Trade Commission proposed amendments to the rule implementing the Children’s Online Privacy Protection Act (COPPA), with new requirements on the collection and retention of children’s data and specific attention paid to targeted advertising. An article in Law360 says the FTC expects a lengthy feedback process over the proposed COPPA changes, which include a requirement that companies that have obtained parental consent to collect a child’s data go back and obtain separate permission before disclosing the data to advertisers. Another change would add the term “biometric identifiers” to the FTC’s definition of personal information.
One thing is clear: laws everywhere will continue to change in the effort to make the internet safe for kids, and legal challenges will follow. Meanwhile, businesses would be wise to keep abreast of new legislation. Law360 quotes Alfred R. Brunetti of New Jersey law firm Porzio Bromberg & Newman: “The savvy and more thoughtful businesses out there will actively be keeping track of this to make sure their business goals are aligned with the regulations that are emerging,” says Brunetti. “The better and more prepared companies are, the easier the road ahead is going to be.”