Regulation Day approaches in the UK as child online safety laws kick into high gear

The deadline is fast approaching for UK regulator Ofcom’s Children’s Access Assessment, which requires content providers that offer user-to-user services in the UK – including social media platforms, gaming platforms and dating apps – to tell Ofcom whether or not kids can access their services. Their answers, due by Wednesday, April 16, will determine whether businesses are in compliance with the Online Safety Act (OSA) and the forthcoming Protection of Children Codes – and whether they face enforcement measures.
The online child protection measures are being rolled out in a phased fashion, but age assurance vendors know full well what’s coming. At the Global Age Assurance Standards Summit, there is much talk of the forthcoming regulations and what can be done to comply – although the general opinion seems to be that the new rules have not yet triggered a spike in demand for age assurance tech. One delegate suspects affected industries are about to see an explosion of penalties, as enforcement commences.
According to a blog from Brett Wilson LLP, the UK Government estimates that around 25,000 businesses will be subject to the new rules. User-to-user services constitute the largest chunk. They include social media platforms (like Meta and X), messaging services (like Messenger), video-sharing services (like TikTok and YouTube), marketplace and listing services (like eBay and Amazon), and file-sharing services (like Microsoft OneDrive and Google Drive).
Search, including general search like Google and product-specific search services like Skyscanner, is also covered, as are internet services that publish or display pornographic content.
The service does not have to operate out of the UK. The Act applies to any service with “links with the UK,” defined as having a significant number of UK users, targeting the UK as a market, or being capable of being accessed by UK users in a way that presents a risk of harm to them.
The highest-risk platforms – the social media and pornography sites with the largest numbers of users – are designated Category 1 and bear the highest duty of care. Category 2A will contain the highest-risk search engines, such as Google and Bing, and Category 2B the remaining high-risk, high-reach sites.
In comments sent to Biometric Update, Lina Ghazal, head of regulatory and public affairs at age assurance provider VerifyMy, says next Wednesday is the deadline for these services to have “done their homework – an access assessment into whether children can use them. Only platforms with highly effective age assurance methods in place, such as email-based age estimation, that prevent children from accessing the service, will be exempt from conducting a children’s risk assessment and implementing the upcoming Protection of Children Codes.”
“Tech companies must now go beyond basic compliance and adopt a safety-by-design mentality that prioritises child safety. This is not just a regulatory requirement, but a moral responsibility that the industry must take seriously.”
To enforce the laws, Ofcom has been granted the power to fine companies up to the greater of £18 million (roughly US$23 million) or 10 percent of qualifying worldwide revenue, and can apply for a court order to block an online platform in the UK.
Gaming needs a buzzkill-free form of ‘invisible’ age assurance
In a presentation at the summit, VerifyMy’s Chris Murphy shines a light on the specific challenges facing the $400 billion games industry. Banning young users from games that only contain some restricted content ends up limiting their access to content that’s deemed perfectly fine. The answer, says VerifyMy, is to scale age assurance within an experience, to create tiers of access. So, if a user is playing a game but wants to enter a chat, they can be asked for proof of age for that specific action.
The problem is that gaming is immersive, inviting players into long stretches of uninterrupted play. An age assurance prompt in the middle of a session could be a major killjoy. Hence the need for risk-based age checks that do not interrupt the user flow.
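As a rough illustration of the tiered approach described above, the sketch below gates each in-game action by a minimum assurance level and only triggers a check when the player attempts a gated action. All names and the provider call are hypothetical, not any vendor’s actual API.

```typescript
// Hypothetical sketch of tiered, action-triggered age assurance in a game client.
// AssuranceTier, requiredTier and requestAgeCheck are illustrative names only.

enum AssuranceTier {
  None = 0,         // no age signal collected
  SelfDeclared = 1, // user-entered date of birth
  Estimated = 2,    // e.g. email- or facial-age estimation
  Verified = 3,     // document- or ID-based verification
}

type GameAction = "play_core_game" | "join_voice_chat" | "open_marketplace";

// Map each action to the minimum assurance tier it requires, so the core
// game stays open while higher-risk features are gated.
const requiredTier: Record<GameAction, AssuranceTier> = {
  play_core_game: AssuranceTier.None,
  join_voice_chat: AssuranceTier.Estimated,
  open_marketplace: AssuranceTier.Verified,
};

interface Session {
  userId: string;
  tier: AssuranceTier; // highest tier already established for this session
}

// Placeholder for a call out to an age assurance provider; only invoked
// when the user attempts a gated action, so normal play is never interrupted.
async function requestAgeCheck(userId: string, needed: AssuranceTier): Promise<AssuranceTier> {
  // ...integrate with the chosen provider here...
  return needed; // assume success for the purposes of the sketch
}

async function attemptAction(session: Session, action: GameAction): Promise<boolean> {
  const needed = requiredTier[action];
  if (session.tier >= needed) return true; // already cleared, no prompt shown
  session.tier = await requestAgeCheck(session.userId, needed);
  return session.tier >= needed;
}
```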
“How do we move beyond the gate, and think about age assurance as connected ecosystems?” Murphy asks.
The session also includes Luc Delany from k-ID, a firm that aggregates regulatory requirements from around the world to automate compliance. Delany points out that what the industry calls “live service games” are “more and more like social media experiences,” increasingly involving social and financial transactions.
The combined challenges of wanting to maintain the user experience while also satisfying regulatory compliance – and keeping kids away from adult content – call for “more invisible safety protocols,” which can be accomplished with certain types of biometric and non-biometric age assurance. (VerifyMy, for instance, is a pioneer in email-based age inference.)
Regardless of exactly how and when it’s done – still, as the summit suggests, very much a debate – Delany believes age assurance needs to be responsive and adaptable as gamers grow. “Things need to scale and change as a user learns,” he says.
One thing seems certain: as regulators around the world come to understand that the internet can be regulated, certified firms establish a stable industry, and international standards come into play, age assurance – for gaming, porn, social media and other online activities – is on its way to being normalized.