EU seeks feedback on age assurance, child protection under DSA

The EU has issued an official call for feedback on draft guidelines for protecting minors online under the Digital Services Act (DSA), including on the age assurance measures platforms should introduce to limit children’s exposure to pornography and other age-inappropriate content.
The draft guidelines are open for public feedback until June 10, 2025, with publication expected during the summer.
The recommendations are based on Article 28 of the DSA, which has been at the center of the European Board for Digital Services’ age assurance discussions. The guidelines apply to all platforms accessible to children, except those operated by micro and small enterprises. They divide age assurance measures into three categories: self-declaration; age estimation, most commonly carried out with face biometrics; and age verification.
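The guidelines do not prescribe an implementation, but the tiering lends itself to a simple ordering under the risk-based approach described further below. The following is a hypothetical sketch of the three tiers as a strength-ordered check; the names, numeric ordering, and the meets_requirement helper are illustrative assumptions, not anything from the Commission’s text.

```python
# Hypothetical illustration of the three assurance tiers named in the draft
# guidelines, ordered by strength. Names and structure are assumptions made
# for illustration, not the Commission's.
from enum import IntEnum

class AgeAssurance(IntEnum):
    SELF_DECLARATION = 1  # user simply states an age; weakest, easily falsified
    AGE_ESTIMATION = 2    # inferred, e.g. from face biometrics; approximate
    AGE_VERIFICATION = 3  # checked against an authoritative source; strongest

def meets_requirement(method: AgeAssurance, required: AgeAssurance) -> bool:
    """Risk-based check: a higher-risk service demands at least a given tier."""
    return method >= required

# A service rated high-risk would reject bare self-declaration:
assert not meets_requirement(AgeAssurance.SELF_DECLARATION,
                             AgeAssurance.AGE_VERIFICATION)
```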
At the same time, the European Commission is also working on an age assurance app that will allow online service providers to check whether users are under the age of 18. A beta version of the white-label app, developed by Deutsche Telekom and Swedish biometrics firm Scytáles, has been made available on GitHub.
“The aim of the project is to develop an EU harmonised privacy-preserving age verification solution, including a white-label open-source app by summer 2025,” the Commission says.
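To make the privacy-preserving idea concrete: the goal is that a service learns only a yes/no answer about an age threshold, never a birthdate or identity. Below is a minimal, hypothetical sketch of that pattern using an Ed25519-signed boolean attestation; it is a simplification for illustration, and the actual white-label app is built on the EUDI Wallet technical specifications rather than this scheme.

```python
# Minimal sketch of a privacy-preserving "over 18" check, assuming a trusted
# issuer signs only the derived claim. Illustrative only; the real EU app
# follows EUDI Wallet specifications, not this simplified scheme.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side: sign the boolean claim, so no birthdate ever leaves the issuer.
issuer_key = Ed25519PrivateKey.generate()
attestation = json.dumps({"age_over_18": True}).encode()
signature = issuer_key.sign(attestation)

# Verifier side (the online service): check the signature against the issuer's
# public key and learn a single boolean, nothing else about the user.
issuer_public = issuer_key.public_key()
issuer_public.verify(signature, attestation)  # raises InvalidSignature if tampered
assert json.loads(attestation)["age_over_18"] is True
```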
The app, however, has already drawn criticism from advocacy groups such as European Digital Rights (EDRi), which claims that age assurance tools “systematically leak data in ways that threaten the privacy and data protection rights of children and adults alike.”
The app is based on the same technology as the EU Digital Identity (EUDI) Wallet and represents an interim solution until the wallet becomes available by the end of 2026.
The DSA guidelines offer a risk-based approach that platforms can tailor to specific services without restricting minors’ right to participate and express themselves online, according to the Commission.
Aside from age assurance measures, the guidelines also recommend that platforms set children’s accounts to private by default and adjust their recommender systems to reduce the risk of children falling into rabbit holes of harmful content. Another suggestion is to allow children to block and mute users and to ensure they cannot be added to groups without their explicit agreement, to prevent bullying.
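As a rough illustration of what “safe by default” could look like in practice, here is a hypothetical settings object reflecting those recommendations; the field names are invented for the example and do not come from the guidelines themselves.

```python
# Hypothetical "safe by default" settings for a minor's account, mirroring the
# recommendations above; field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MinorAccountDefaults:
    profile_private: bool = True             # accounts private by default
    downrank_harmful_in_recs: bool = True    # recommender tuned against rabbit holes
    can_block_and_mute: bool = True          # minors can block and mute other users
    group_add_requires_consent: bool = True  # no adding to groups without agreement

# All protections are on unless a deliberate choice turns them off.
settings = MinorAccountDefaults()
assert settings.profile_private and settings.group_add_requires_consent
```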
Article Topics
age verification | Article 28 | biometrics | children | data privacy | data protection | Digital Services Act | EU | Europe | European Commission