EU moves forward on age verification with release of guidelines, software

Five countries to pilot variants of white label age assurance app
The European Commission has released both its guidelines for protecting kids online and its “white label” age verification software, which is fully interoperable with incoming EU Digital Identity (EUDI) Wallets.

Changing its messaging somewhat, a release from the commission now calls the “mini wallet” app a free “blueprint” that member states and private sector firms can use to build their own, localized tools.

The software will undergo a pilot phase of testing in Denmark, France, Greece, Italy and Spain, as well as with online platforms, end users and “other interested parties.” However, technical specifications, source code and a beta release of the solution are already published.

“The age verification blueprint is an open-source implementation of these specifications,” the commission says. “It can be easily customized by app publishers, without the possibility, however, to change the privacy-preserving features.”

The system is designed and developed by T-Scy, a consortium of Sweden’s Scytáles and Germany’s T-Systems (a subsidiary of Deutsche Telekom), under a two-year contract awarded by the commission in February 2025.

Commission adopts guidelines on protection of minors under DSA 

In pursuing an “EU-harmonized approach to age verification,” the European Commission has also adopted and published its guidelines on the protection of minors under the Digital Services Act (DSA).

A release says the guidelines “set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices.”

They apply to all online platforms accessible to minors, with an exception for micro and small enterprises. Among other measures, they recommend default privacy settings for minors’ accounts, modifying platforms’ targeted recommender systems, giving kids broad block and mute capabilities, and generally protecting kids from commercial practices that may be manipulative or addictive, by using age-appropriate design.

The guidelines also recommend the use of “effective age assurance methods provided that they are accurate, reliable, robust, non-intrusive, and non-discriminatory. In particular, the guidelines recommend age verification methods to restrict access to adult content such as pornography and gambling, or when national rules set a minimum age to access certain services such as defined categories of online social media services.”

A risk-based approach recognizes that online platforms “may pose different types of risks to minors, depending on their nature, size, purpose, and user base. The guidelines enshrine a safety and privacy by design approach and are grounded in children’s rights. Platforms should ensure that the measures they take do not disproportionately or unduly restrict children’s rights.”

Following the guidelines is voluntary and “does not automatically guarantee compliance.”

Feedback on guidelines calls for ‘variety of privacy preserving age assurance’ tools

Along with the guidelines, the commission has published a report summarizing contributions it received during a call for evidence on the guidelines.

The call for evidence ran from July 31 to September 30, 2024. The commission received 174 submissions and 15 additional contributions from “a wide range of respondents.”

The feedback includes questions around scope, and a consensus among stakeholders that micro and small platforms should be exempt, as set out in DSA Article 19. There is advice on alignment with the United Nations Convention on the Rights of the Child (UNCRC). Many agree on the need for online platforms accessible to minors to conduct regular, child-specific impact assessments.

A significant chunk of feedback is devoted to age assurance.

“Several contributions stressed support for age assurance and age verification requirements to be developed and harmonized at EU level. Many inputs stressed that age assurance or age verification solutions alone do not absolve platforms from deploying measures to protect children by default and by design. Many stakeholders requested clarification as to the scope of age assurance or age verification solutions under Article 28. Several contributions stressed the need for age assurance and age verification to be risk based and directed mostly at high-risks platforms e.g. adult services and content.”

Stakeholders identify a need for “a variety of privacy preserving age assurance methods for users to choose from according to their privacy needs.” Age estimation is highlighted “as the most accurate, privacy-preserving and scalable approach” to age assurance. Debate continues over where in the tech stack age assurance belongs, with some advocating checks at the device and operating system level, and others favoring service-level age verification.

“Some stakeholders stressed for the guidelines to avoid requiring collection or processing of additional personal information for the purposes of age assurance, and to avoid recommending platforms to treat all users like children as it would have the side effect of upsetting adults’ user experience and impinging on their ability to access information.”

Regardless, the EU continues to press ahead on age assurance and to begin enforcing its new rules; X, TikTok, Facebook and Instagram are all currently under investigation by EU regulators over whether they comply with the DSA. In a statement, Henna Virkkunen, Executive Vice-President of the European Commission for Technological Sovereignty, Security and Democracy, says “platforms have no excuse to be continuing practices that put children at risk.”

For more insight on the rapidly shifting landscape of age assurance, listen to a recent conversation with Iain Corby, executive director of the Age Verification Providers Association (AVPA), on the Biometric Update podcast.



Comments

One Reply to “EU moves forward on age verification with release of guidelines, software”

  1. It is worth highlighting this new paragraph, added to the final version but not in the draft issued for consultation, in respect of the use of age estimation:

    “Age estimation methods can complement age verification technologies and can be used in addition to the former [sic], or as temporary alternative in particular in cases where verification measures that respect the criteria of effectiveness of age assurance solutions outlined in Section 6.1.4, with particular emphasis on protecting users’ right to privacy and data protection as well as accuracy, are not yet readily available. This transitory period should not extend beyond the first review of these guidelines (38). For example, platforms offering adult-restricted content may use ex ante age estimation methods if they can prove that such methods are comparable to those of age verification, in respect of the criteria set out in Section 6.1.4, in the absence of effective age verification measures (39). The Commission may in due course supplement the present guidelines with a technical analysis on the main existing methods of age estimation.”

    This appears to be a move away from the original position that only age verification methods could be used for legally enforced minimums and high risk harms – which we pointed out was based on a fundamental misunderstanding of the relative reliability of estimation vs verification.
