Aylo says OSA makes business impossible, blocking adult content sites for new UK users

“Age assurance is failing in the UK.” This is the message Aylo communicated in an informal press conference held this morning, in which it announced that it will cut off access to its free sites for new UK users as of February 2, 2026. Age-verified UK users will still have access through their existing accounts.
Representatives from adult content network operator Aylo and its owner, the Canadian firm Ethical Capital Partners (ECP), say that the Online Safety Act as written is not working: even though Pornhub’s traffic has cratered, the law has not made it more difficult for underage users in the UK to access adult content.
The company has already blocked access to its sites in France and 22 U.S. states, in response to age assurance laws.
“What we’ve seen is more of the same, so that adult websites still remain very accessible to minors,” says Alexzandra Kekesi, VP of brand and community at Aylo. “Those websites, for the vast majority, do not have any kind of compliance protocol in place. They do not moderate their content.”
In a demo illustrating the problem, Solomon Friedman, partner and VP of compliance for ECP, shows that a basic Google search for “free porn” conducted from a UK IP address brings up a list of search results that puts Pornhub, which is compliant, first – but also includes noncompliant sites that have not implemented age verification or age estimation measures.
“FreePorno.xxx is the second most popular result,” Friedman says. “It is completely noncompliant. Six out of 10 of the results on the front page of those Google results have no age assurance whatsoever and allow adults and children alike to access explicit content.” These are newer (or newly discovered) streaming sites that have easily made their way up the search results by not following the rules.
Moreover, Friedman alleges, “those sites that do not comply with age assurance, they don’t comply with other content restrictions either. They’re not verifying identity. They’re not verifying the age of uploaders. They’re not verifying and moderating the nature of the content. They don’t care about the laws under the Online Safety Act. They don’t care about intimate image abuse. They don’t care about child sex abuse material. They don’t comply. So this law by its very nature is pushing adults and children alike to the cesspools of the internet, to the most dangerous material possible.”
The argument has more teeth for porn than for the social media giants that have attempted to co-opt it. A porn site, it would seem, is easy enough to set up, and does not depend, as social platforms do, on a connected network of users. Indeed, the issue from the start is that most porn users want to be anonymous. In some cases – perhaps many – that wish will override concerns about ethically sourced porn.
Time for Apple, Google, Microsoft to step up
Friedman also uses the demo to argue for Aylo’s preferred solution: making Google, Apple and Microsoft use their existing parental controls to take care of age checks at the device or OS level. “Microsoft, Apple, and Google all have very robust built-in parental controls. Those are device-based controls that operate regardless of whether or not the site that is being accessed is compliant, that operate regardless of whether a VPN is being used, and they are effective.”
Friedman notes the debate currently raging in the United Kingdom over whether or not to ban VPNs, “because anyone who has even the most passing familiarity with the internet knows that a VPN can be used to geolocate oneself outside of a jurisdiction and thereby easily circumvent the region-specific age assurance laws.”
“Do we block VPNs? Do we restrict them for children? This is a nonsensical debate when device controls are enabled. When access is controlled at the device level, it’s efficient and it’s effective. It’s privacy-preserving. It gets the job done.”
In summary, Aylo and ECP believe that “either Microsoft, Apple, and Google can do the right thing proactively, or they can be forced to do the right thing by government.”
This volleys the legal and liability issue back to the app store operators and device manufacturers, in keeping with the social media titans’ argument that age assurance at the platform level doesn’t work as it should. It is consistent with Aylo’s position throughout; the company has already sent letters to those companies making its case for default parental controls.
Aylo and ECP are careful not to point the blame at Ofcom, which they call “a dedicated regulator working in good faith” to enforce the law.
“The problem here, however, is not the regulator,” Friedman says. “It is the law. This is a law which has set Ofcom up to fail.”
Aylo proposal comes with risks of its own – and opponents
Three big hurdles stand between Aylo and device-based age controls. The first is volume: making the strongest parental control settings the default pushes age verification well beyond those who want to access porn, requiring anyone accessing any kind of age-restricted content to pass an age check. In the long term, this could lead to a broad sanitizing of the internet, and the risk is that the net gets cast too wide, blocking (for instance) vulnerable LGBTQ+ teens from accessing community resources.
The second is more conceptual. Aylo’s presentation begins with the assertion that it “made the decision to comply with the Online Safety Act in the UK as it was written.” This lays bare the hard truth that regulatory compliance is still seen, ultimately, as a choice rather than an obligation with established consequences.
The third is the biggest: the three-headed monster of Apple, Google and Microsoft, whose legal war chests are effectively infinite, and which are just as uninterested in managing age assurance as Pornhub is.