
Ofcom releases online safety rules, sets compliance deadline of March 16, 2025

‘The Online Safety Act has come into force,’ says regulator, warning of steep fines

Ofcom has published the first major policy statement outlining final guidelines for online platforms under the Online Safety Act. The statement brings into force a law that age assurance providers, global regulators and online service providers have been eyeing closely.

“This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm,” says a release from the UK regulator.

An overview document lists expected changes. The very first demonstrates the scope of change Ofcom is aiming for: firms are expected to put “managing risk of harm at the heart of decisions.” This has, to put it mildly, not been a guiding principle of the social media or online pornography industries to this point, and will likely force many to implement stronger age assurance measures and other protections.

Moreover, Ofcom is making it personal. “To ensure strict accountability, each provider must name a senior person responsible for illegal harms, such as terror, hate, and fraud, among many others.”

Protecting children from abuse and exploitation and protecting women and girls are also flagged as areas of concern.

Ofcom has given platforms three months to assess and mitigate risks to kids, or face fines of up to 10 percent of global annual turnover (or up to £18 million, whichever is greater). The formal deadline for compliance with Ofcom’s child safety rules is March 16, 2025, and the regulator is not messing about, saying it is “ready to take enforcement action if providers do not act promptly to address the risks on their services.”

“We can take enforcement action as soon as the duties come into effect, and while we will support providers to comply with their duties, we won’t hesitate to take early action against deliberate or flagrant breaches.”

The regulator says it will also “use our transparency powers to shine a light on safety matters, share good practice, and highlight where improvements can be made.”

While porn sites and social media platforms are likely to get the most attention, the breadth of the rules means a wide variety of sectors will be affected.

Commentary in TechCrunch says “it’s fair to say that every tech firm that offers user-to-user or search services in the UK is going to need to undertake an assessment of how the law applies to their business, at a minimum, if not make operational revisions to address specific areas of regulatory risk.”

And more requirements are on the way. In comments to the BBC, Ofcom CEO Melanie Dawes says in January, “we’re going to come forward with our requirements on age checks so that we know where children are. And then in April, we’ll finalize the rules on our wider protections for children – and that’s going to be about pornography, suicide and self-harm material, violent content and so on, just not being fed to kids in the way that has become so normal.”

Dawes says this is the “last chance” for the industry to address any problems before facing punitive measures. She has even dangled the possibility of a social media ban for kids in the mold of Australia’s recent legislation, saying that if platforms “don’t start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous.”

AVPA looks ahead to age assurance details in January

In response to a LinkedIn post from Ofcom marking the publication, the Age Verification Providers Association (AVPA) celebrated it as a success for the age assurance sector – but not without caution.

“This is a welcome step forward in the process but everyone recognises there is more to do,” says the comment from AVPA, which provided detailed feedback on draft versions of the guidance. The industry association is specifically awaiting policy that explicitly requires age assurance or age checks for creating social media accounts, to make sure platforms are adhering to their own age rules.

“That was a fundamental objective of the Online Safety Act, and the focus of a lot of debate in Parliament,” says the post. “If 7-year-olds are still opening accounts with impunity, that will be a major failure.”
