New proposal pushes EU-wide digital minimum age of 16 for social media

EU lawmakers want children to be protected online, and some of them say platforms aren’t doing enough, fast enough. New statistics from a Eurobarometer survey show that 74 percent of 15-24-year-olds in the EU follow influencers or content creators, and 65 percent prefer social media as their main news source. Now, members of the European Parliament (MEPs) are calling for an EU-wide restriction on social media for anyone under 16, unless they obtain express parental consent.
MEPs want Europe to crack the whip on DSA
A release from the European Parliament says MEPs from the Internal Market and Consumer Protection Committee have adopted a report expressing “concerns over major online platforms’ failure to protect minors adequately.” The report calls for a ban on harmful practices such as addictive design and gambling-like game features, as well as “an EU-wide digital minimum age of 16 for access to social media, video sharing platforms and AI (artificial intelligence) companions, unless authorised by parents, and a minimum age of 13 to access any social media.”
Addiction, mental health, and exposure to illegal and harmful content are cited as primary concerns driving the proposal. Also at issue is perceived laxity on the part of regulators in enforcing the Digital Services Act (DSA). “The MEPs urge the Commission to make full use of its powers under the DSA, including issuing fines or, as a last resort, banning non-compliant sites or applications that endanger minors.”
Do more, faster, say the MEPs. That includes supporting the European Commission’s efforts to develop privacy-preserving age assurance systems. It means banning engagement-based recommender algorithms for minors and disabling the most addictive design features by default; infinite scrolling, autoplay, disappearing stories, and harmful gamification practices are no longer welcome. It means addressing the ethical and legal challenges arising from AI nudify apps, and firmly enforcing AI Act rules against manipulative and deceptive chatbots. And it means banning “loot boxes” in games accessible to minors and prohibiting platforms from monetizing “kidfluencing” efforts.
Failed age checks could cost Zuckerberg a few jets
And it means getting personal: “consider introducing personal liability for senior management in cases of serious and persistent breaches of minor protection provisions, with particular respect to age verification.” According to Politico, the suggestion was put forward by Hungarian conservative lawmaker Dóra Dávid, a former Meta employee.
Some of this, the MEPs believe, can be addressed in the future Digital Fairness Act. But the urgency is focused on two main objectives.
“Our report clearly states the need for increased protection of minors online in two respects,” says Rapporteur Christel Schaldemose. “Firstly, we need a higher bar for access to social media, which is why we propose an EU-wide minimum age of 16. Secondly, we need stronger safeguards for minors using online services. My report calls for mandatory safety-by-design and for a ban on the most harmful engagement mechanisms for minors.”
Euractiv has an interview with Schaldemose in which she says the idea is to avoid a tangle of different rules across the EU, and states that “a crystal-clear obligation for companies to use age verification is required so that it cannot be challenged in court.”
The European Parliament is set to vote on online safety recommendations between November 24 and 27.