
Age assurance and children’s data privacy regulations come with endless complexity

Regulatory approaches in EU, UK and U.S. share goals in a crowded legal landscape

Privacy regulation is “an incredibly complex regulatory landscape,” says Claire Quinn, chief data privacy officer for Privo, during a panel on children’s online data privacy and age assurance hosted as part of the Mobile Ecosystem Forum in London.

The theme of complexity recurs throughout the discussion, which also features Denise Tayloe, president and CEO of Privo; Katerina Tassi, senior associate for the privacy and data protection group at law firm Bird & Bird; and Ian Deasha, group manager of technology (identity and trust) for the Information Commissioner’s Office (ICO). With varying approaches to children’s privacy and digital identity evolving quickly around the world, the mosaic of laws, considerations and potential solutions in play is gobsmacking.

Age of consent an issue across EU states; more enforcement happening

Tassi’s overview of the EU landscape points to the GDPR as its foundation. But she notes that the GDPR “includes limited references to children’s data and does not contain explicit obligations to carry out age assurance” – even if certain definitions and restrictions imply its necessity.

Among the complicating factors is fragmentation across member states in the matter of the age of digital consent, which can vary from nation to nation. There is activity among states to align on child privacy laws and age assurance; governments in France and Spain both have children’s privacy initiatives underway. But the matter is complex, given the variety of laws and regulations already in place.

Tassi says that the biggest impact may come from enforcement, which is becoming more common; she notes decisions in Ireland and the UK regarding TikTok.

UK calls for risk-based consideration in determining age assurance needs

Regarding the UK, she notes the Children’s Code and the ICO’s opinion on age assurance as proof that children’s digital privacy remains a hot topic there.

Ian Deasha concurs. Speaking on the ICO’s standards, he clarifies that age assurance is not mandatory. Content providers can also choose to simply make their sites safe for kids. “Our opinion refers to age assurance as one of two things,” he says.

“Organizations can provide by default an age appropriate, kid-safe space, or they can go the other way, and that’s where we’re talking about age assurance.” Sites can “provide a level of assurance appropriate to ensure that children cannot or should not be able to access a particular type of resource,” he says.

“This is a risk-based consideration, so the level of assurance you need to have needs to be directly related to the risk to the rights of the children who are accessing your platform, because of the kind of processing that you are doing.” Deasha points to facial age verification and other biometric modalities as solutions that exist and continue to evolve alongside age-appropriate design standards.

U.S. forced to wrangle with state laws and Big Tech lobby

Perhaps most complex of all is the U.S., where a scramble to enact privacy laws on both the state and federal level has led to what Denise Tayloe calls “a tsunami of new regulations.”

The Children’s Online Privacy Protection Act (COPPA) is still the big cheese of American privacy law, and Tayloe confirms that changes are coming in COPPA 2.0 that will address the age of consent to information, targeted ads and acceptable methods of verifiable parental consent.

She characterizes the latter as a shift from consent to knowledge: “It’s not about permissioning, it’s not parental consent, but rather parental tools to know your kid has an account and to perhaps restrict the disclosure of their data and the time that they may be able to be on.”

Time is a particular issue in the case of social media, which has been the prompt for a number of laws that make up the overall gumbo. She notes that over half the Senate is on board with the Kids Online Safety Act (KOSA), which “requires social media platforms to put the well-being of children first and provide them an environment that’s safe by default.” That, she notes, “starts to look like what we’ve seen in Europe.”

Still, the U.S. will never concede to regulations without a fight. U.S. states that have privacy legislation addressing kids’ data in place or in process include California, Maryland, Vermont, Minnesota, Nevada and New Mexico. New York is pushing the “Stop Addictive Feeds Exploitation” (SAFE) Act, which limits how content feeds can be targeted to youth. And there are moves to enact federal privacy legislation with APRA. But Big Tech has mustered its lobbyists to push back; Quinn points out that Maryland removed an explicit age assurance mandate from its law in part as a way to push it through without facing an injunction from social media firms.

In brief, the subject of age assurance across jurisdictions, cultures and use cases is practically incompatible with brevity. Those hoping for a killer app to emerge will be waiting for a long time. Meanwhile, a selection of age assurance, age verification and age estimation tools is available. To quote Ian Deasha, “It will be important to consider what is appropriate for each organization depending on how their services are built and what their objectives are.”

Again, to quote Socure’s Josh Linn writing in a blog about the complexities of age verification, “age verification online is incredibly complex and has no perfect solution.”

Yoti celebrates NIST evaluation

A National Institute of Standards and Technology (NIST) evaluation has shown that Yoti facial age estimation is “an effective, fair and inclusive way to check age and age ranges, such as over 18, with an appropriate threshold,” according to a blog post from the biometrics company.

Paco Garcia, CTO at Yoti, says the evaluation “will provide scientific certainty for businesses and regulators that facial age estimation is an accurate, fair and privacy-preserving age assurance solution. This comes at a time when there is increasing legislation globally demanding that organizations effectively check the age of their users.”

NIST evaluated Yoti’s facial age estimation as part of its Face Analysis Technology Evaluation (FATE) program.

“Parents, businesses and regulators need a competitive age assurance market with independent testing of facial age estimation algorithms to ensure they are effective and fair,” says Yoti CEO Robin Tombs, in a post on LinkedIn. “Five years ago quite a few otherwise smart people dismissed our FAE as fake science or said that we could not be trusted to measure our performance accurately. Some still don’t get that facial age estimation can be completed without performing facial recognition.”

“NIST makes it super clear that: ‘Age estimation is not Face Recognition. AE analyzes one face to produce an estimate of age. FR is concerned with who is in an image. The two techniques employ different algorithmic machinery for these two purposes.’”
