
UK age verification is here: Ofcom set to begin enforcing Online Safety Act

Sites not complying with child age checks subject to huge fines
The age assurance bell is tolling in the UK, where the Online Safety Act is set to take effect. The July 25 deadline has arrived and national regulator Ofcom is expected to commence enforcement of the Children’s Codes, which require platforms to prevent minors from accessing harmful content such as self-harm, suicide and eating disorder content, as well as pornography. That means adult content sites operating without online age checks for their users will be subject to huge fines of up to 18 million pounds (about 24 million U.S. dollars) or 10 percent of global turnover, whichever is greater. 
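The fine structure is worth pausing on: the statutory maximum is whichever is greater of the flat £18 million cap or 10 percent of global turnover, so the exposure scales with company size. A minimal sketch of that arithmetic (the function name and turnover figures are illustrative, not from Ofcom):

```python
# Illustrative sketch, not official Ofcom guidance: the OSA maximum fine
# is the greater of GBP 18 million or 10 percent of global turnover.
def max_osa_fine(global_turnover_gbp: float) -> float:
    """Statutory maximum fine in GBP for a given global turnover."""
    return max(18_000_000.0, 0.10 * global_turnover_gbp)

# A small site is bounded by the flat GBP 18M cap:
small = max_osa_fine(5_000_000)      # -> 18,000,000

# A platform with roughly GBP 2bn in turnover could face about GBP 200M,
# which is the order of magnitude of the X estimate cited later on:
large = max_osa_fine(2_000_000_000)  # -> 200,000,000
```

The takeaway is that for any company with turnover above £180 million, the percentage-of-turnover prong, not the flat cap, sets the ceiling.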

For users, the regulator has published a guide on what to expect, which also outlines the permitted methods for sites to check a user’s age: facial age estimation (FAE), open banking, digital ID services including wallets, credit card age checks, email-based age estimation or inference, mobile network operator age checks, and photo ID matching. (Those with a deeper attention span can watch adult star Ivy Maddox read the full text of the OSA.) 

The UK age assurance sector has been anticipating this date (dubbed “AV Day” in industry lingo) for some time. The regulatory rigor that Ofcom has promised suggests an increased demand for age verification and age estimation products and services, as digital age checks stand to become common practice. Biometric Update and Goode Intelligence have published a Market Report and Buyer’s Guide for the UK market to help customers navigate the shift. 

Some sites covered under the OSA and the Children’s Codes have already pledged to implement government-approved, highly effective age assurance measures in keeping with the law. But major sites such as Pornhub, owned by Aylo, continue to mount resistance against similar laws in the EU, recently shutting down access in France in response to that country’s imposition of an age verification standard for access to adult sites.  

AVPA warns sites not to test age checks deadlines 

Iain Corby, executive director of the industry group the Age Verification Providers Association (AVPA), says there are signs that the adult content sector might be mistakenly trying to outlast the OSA.   

“As ever with the adult industry, there has been a last-minute rush to implement highly effective age assurance measures, and some are still labouring under the misapprehension that they can wait for a final warning letter from Ofcom before acting,” Corby says in comments emailed to Biometric Update. “The regulator has been crystal clear that if you get a letter from them after 25 July, IT IS TOO LATE – you are already going to be penalized.” He makes a similar point, and delves deeper into the potential consequences, in a recent interview for the Biometric Update Podcast. 

Corby says he has also seen “evidence of false claims from some sites that they are using the services of our members,” and that AVPA is “preparing a scheme to offer digital certification to websites which do appoint and make use of legitimate age verification providers, so regulators can check efficiently if a site is compliant.” He raises the risk of bogus age assurance: “poor quality or sham age checks being deployed to confound enforcement.” 

Surely, no one wants a scenario in which a shoddy age tool designed to fake out regulators sucks users’ personal information into a shady database. Corby believes that once people see what options are available – and come to understand how they work to preserve privacy – “we will build an invaluable evidence base to demonstrate that convenient, privacy-preserving age checks do not cause the sky to fall in.” 

Luciditi sees interest spike, Yoti asks Ofcom to play fair

Providers express similar optimism. Dan Johnson, chief product officer of Luciditi – an age assurance provider based in Bromsgrove, south of Birmingham, and listed on the UK Digital Identity and Attributes Trust Framework (DIATF) register – says the company has experienced a strong uptick in inbound enquiries for its age verification solution. 

“Our competitors have also recently announced implementations for a variety of clients, demonstrating a desire from providers of age-restricted material to implement highly effective measures to keep this content out of the reach of young people,” Johnson says in an email to BU.  

“Following AV Day and over time, we expect the statistics and surveys relating to consumption of age-restricted content online to show a reduction in underage people experiencing adult content and other harmful material. Beyond AV Day, we look forward to continuing to work with Ofcom and the UK Government to innovate and implement further measures to ensure a safe online experience for citizens of all ages while preserving privacy and the rights of adults to access age-restricted content.” 

Robin Tombs is CEO of Yoti, an established UK digital ID company that offers age verification and facial age estimation. In comments sent to Biometric Update, Tombs says “Friday marks a critical moment for online safety in the UK, as platforms become legally accountable under new duties to protect children and tackle harmful content.” 

He encourages Ofcom to be fair and equal with its enforcement powers – i.e., not to penalize some sites while letting others operate without censure. “It is unfair and bad regulatory practice to allow many smaller businesses, and even some bigger businesses, to win market share through non compliance at the expense of the compliant ones,” Tombs says.  

“With enforcement powers in place, it’s vital that Ofcom ensures a level playing field – holding all platforms to the same standards of compliance and transparency. At Yoti, we’ve spent years working with platforms to implement robust, privacy-preserving age assurance technologies. As these new legal requirements come into force, we’re ready to support many more services to meet their obligations in a way that’s secure, scalable and respectful of their users’ privacy.”

Ofcom to industry players: ‘Do you feel lucky, punk?’

In its official press release, Ofcom strikes a stern note, doubling down on its promise to act swiftly in cracking down on violators.   

“Ofcom is ready to enforce against any company which allows pornographic content and does not comply with age-check requirements by the deadline,” the regulator says. “Today we are extending our existing age assurance enforcement programme – previously focused on studio porn services – to cover all platforms that allow users to share pornographic material, whether they are dedicated adult sites or other services that include pornography.”

“We will be actively checking compliance from 25 July and, should it be necessary, we expect to launch any investigations into individual services next week. These would add to 11 Ofcom investigations already in progress.” 

The regulator says it is already seeing commitments from major platforms, with Bluesky, Discord, Grindr, Reddit and X among the latest firms to agree to age-gating. And, “over the last month, the UK’s biggest and most popular adult service providers – including Pornhub – plus thousands of smaller sites have committed to deploying age-checks across their services.”

New task force for high risk content, monitoring program

For any sites tempted to shrug off the deadline, Ofcom has readied enforcement tools for noncompliance of every size. A new “small but risky taskforce” will focus on sites dedicated to the dissemination of harmful content, including self-harm and suicide, eating disorders or extreme violence and gore.

On the other hand, a new “extensive monitoring and impact programme” will focus on “the biggest platforms where children spend most time – including Facebook, Instagram, Roblox, Snap, TikTok and YouTube.” The would-be All Stars of Noncompliance could be in trouble, since Ofcom’s Codes also “demand that online services act to protect children from dangerous stunts or challenges, misogynistic, violent, hateful or abusive material, and online bullying.” 

Personalized feeds, in particular, are “children’s main pathway to encountering these harms.” 

Yet, “even where sites and apps do not technically allow these types of harmful material under their terms of service, Ofcom’s research shows that such content can be all too prevalent. Our Codes are clear, among other things, that algorithms must be tamed and configured for children so that the most harmful material is blocked.” 

The monitoring and impact programme includes the request for “a comprehensive review of these platforms’ efforts to assess risks to children,” due by August 7. Ofcom will assess these sites and disclose results by September 30, noting in particular “whether they have effective means of knowing who their child users are; how their content moderation tools identify types of content harmful to children; how effectively they have configured their algorithms so that the most harmful material is blocked in children’s feeds; and how they have prevented children from being contacted by adult strangers.” 

The program will also track children’s online experiences to judge whether safety is improving in practice, through ongoing research and consulting with children through the Children’s Commissioner for England. 

Parents support Ofcom efforts to regulate online safety

Ofcom says it has the support of the UK’s parents, with research suggesting that “a majority of parents believe that the measures set out in Ofcom’s Protection of Children Codes will improve the safety of children in the UK.”

“Over three-quarters (77 percent) are optimistic that age checks specifically will keep children safer. Nine in 10 parents (90 percent) agree that it is important for tech firms to follow Ofcom’s rules – but a significant minority (41 percent) are skeptical about whether tech firms will comply in practice.” 

Ofcom Chief Executive Dame Melanie Dawes says “prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK. Our message to tech firms is clear – comply with age-checks and other protection measures set out in our Codes, or face the consequences of enforcement action from Ofcom.” 

Small businesses worry compliance costs favor mega firms

Of course, there are those who do not support the Online Safety Act. A critique in Raconteur calls it a “security and compliance minefield.” It quotes tech small business owners who complain of complications and costs associated with compliance and delays in deployment. And it notes a point from Jonathan Wright, a partner at legal firm Hunton Andrews Kurth: the OSA “is not just a concern for big tech,” whereas “many mid-size and smaller businesses will need to navigate overlapping regimes for data protection, age verification and content moderation,” with potential reputational, operational and legal risks. 

But it also includes an illustration of how many misunderstand current age assurance technologies, in quoting Jason Nurse, a cyber expert at the University of Kent. Nurse asserts that “these sites will be entrusted with storing large amounts of personally identifiable information from potentially vast segments of the population. How can we be confident this data won’t be misused?” He misses the clear answer: rely on a certified age assurance provider that does not collect or store biometric data or personal information. 

X marks the most common porn spot

Another stakeholder who probably isn’t very high on the OSA is Elon Musk, CEO of X, formerly Twitter. That’s because, according to the Telegraph, Ofcom has singled out X over the amount of porn available on its platform, which is restricted only to users over 13. 

The piece cites research by Dame Rachel de Souza, the Children’s Commissioner, which found that many children were more likely to see porn on X than on established adult sites: “some 41 per cent of young people aged 16 to 21 reported having seen pornography on Twitter compared with 37 per cent for dedicated adult sites.”

Given X’s annual global revenue, a maximum fine for noncompliance could mean a penalty of up to £200 million, or about $270 million.
