Privacy-preserving age assurance has arrived; now, it has to keep its promises

The Final Communiqué from the 2026 Global Age Assurance Standards Summit is now available. Summarizing takeaways from the summit, held in Manchester in April, the Communiqué “represents a consensus snapshot of the state of play in age assurance practice, policy and standardization” – and declares confidently that “age assurance has come of age.”
“The global conversation has entered a new phase. It has moved from whether age assurance can be done, to how it must be implemented – lawfully, proportionately, and with respect for fundamental rights.”
This will come as no surprise to anyone working in the age assurance sector, who may have noticed friends and relatives taking a keener interest in their field of late. Age assurance has gone mainstream, but only just; the industry is still working to stabilize definitions, establish concrete benchmarks for compliance purposes, and adapt to rapid change.
Even the definition laid out in ISO/IEC 27566-1:2025, the first ISO/IEC standard on age assurance, illustrates the challenge of keeping pace with innovation: age assurance, it says, “is a set of processes and methods used to verify, estimate or infer the age or age range of an individual, enabling organizations to make age-related eligibility decisions with varying degrees of certainty.”
For now, then, age assurance covers age verification – still often inaccurately used as a blanket term, particularly in the U.S. – age estimation, and age inference methods. The summit, however, introduced a handful of models that could fall outside this framework, indicating that the work of establishing common terminology is ongoing. Comments posted to the Communiqué announcement on LinkedIn express similar sentiments, noting the absence of methods tied to mobile network operators and telco-anchored blind assurance.
To that end, the Communiqué notes “the continuing development of ISO/IEC 27566-2 and ISO/IEC 27566-3,” and welcomes the publication of IEEE 2089.1-2024, “reflecting growing alignment between international standards bodies.” This sets the stage for global consistency in providing “a structured, interoperable and technology-neutral framework for proportionate decision-making.”
“Age assurance has entered a phase of global implementation in which standards, regulation, certification and enforcement must operate coherently to protect children’s rights while preserving privacy, proportionality and trust in the digital environment.”
Communiqué: show your work, respect kids’ privacy
The Communiqué includes six calls to action, collectively based on six guiding principles. The calls to action reflect a landscape that is working through real-world challenges in providing privacy-preserving age checks that meet the standard for widespread public trust.
Number one is to align enforcement with international standards, to “promote clarity, reduce fragmentation and enable consistent, auditable compliance across jurisdictions.”
Number two is to move from deployment to “demonstrable assurance” – which is to say, any entity implementing age assurance should be able to clearly demonstrate why it is needed, what level is proportionate, how the chosen method aligns with standards, and “how fundamental rights risks have been mitigated.”
Number three is to protect children without creating surveillance. “Age assurance must not become a mechanism for revealing identity, a tool for persistent tracking or cross-service correlation of user activity, a gateway to disproportionate data collection, or a system to exclude children or adults from the digital environment.”
Number four is to embed human rights impact assessments into deployments, to ensure ongoing respect for privacy, data protection, equality, accessibility and user experience.
Number five is to address global interoperability by working toward common reference points and concepts, and interoperable digital credentials.
And number six is to support inclusion, taking extra care to factor in the needs of “individuals without formal identity documentation or who are not literate, users in low-connectivity environments, shared device contexts, and persons with a limited capacity to understand or navigate age assurance processes.”
These calls together are based on the guiding principles of human rights and the best interests of the child; risk-based and proportionate implementation; privacy and data minimization by design; standards-based interoperability and technical integrity; digital inclusion and accessibility; and transparency, accountability and independent assurance.
Providers hold the cards and must play fair
Each principle is attached to several guidelines, which drill down into issues like data retention and consumer choice. A key passage from the privacy by design section says that “privacy-enhancing approaches – including attribute-based credentials, zero-knowledge mechanisms and decentralised models – should be preferred by default where they can meet the relevant assurance need. Age assurance must not become a gateway to identity surveillance or function creep.”
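The data-minimization idea behind attribute-based credentials can be illustrated with a toy sketch. This is not any standard’s actual protocol, and the function name and structure here are invented for illustration: the point is simply that the party holding the full record releases only a single yes/no attribute, so the relying party never sees a birthdate or identity.

```python
from datetime import date

def age_attribute(birthdate: date, threshold_years: int, today: date) -> bool:
    """Hypothetical issuer-side check: return only whether the holder
    is at least `threshold_years` old, never the birthdate itself."""
    # Subtract one year if this year's birthday hasn't happened yet.
    years = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return years >= threshold_years

# The relying party receives only the boolean attribute.
print(age_attribute(date(2007, 6, 1), 18, date(2026, 4, 15)))  # True
print(age_attribute(date(2010, 1, 1), 18, date(2026, 4, 15)))  # False
```

Real deployments layer cryptography on top of this idea – zero-knowledge proofs or blinded credentials – so that even the issuer cannot link the attribute release to a specific service, but the minimal-disclosure principle is the same.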
But the biggest issue for age assurance in the long term is perhaps most clearly articulated in the final guiding principle. “In an era of active enforcement and significant financial penalties for non-compliance, coupled with a dynamic and competitive marketplace of providers, transparency and accountability are essential components of lawful and trustworthy age assurance.”
This statement contains the crux of what the 2026 Communiqué has to say. The previous edition declared that privacy-preserving age assurance was possible, and encouraged relying parties to get on board. Since then, the world has given the age check sector a chance to prove it can do what it claims. It is now on the industry to make good on its promise – and what’s at stake is nothing less than foundational trust in an industry on the threshold of long-term stability.