Ofcom releases online safety rules, sets compliance deadline of March 16, 2025

‘The Online Safety Act has come into force,’ says regulator, warning of steep fines

Ofcom has published the first major policy statement outlining final guidelines for online platforms under the Online Safety Act. The statement brings into force a law that age assurance providers, global regulators and online service providers have been eyeing closely.

“This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now being legally required to protect their users from illegal harm,” says a release from the UK regulator.

An overview document lists expected changes. The very first demonstrates the scope of change Ofcom is aiming for: firms are expected to put “managing risk of harm at the heart of decisions.” This has, to put it mildly, not been a guiding principle of the social media or online pornography industries to this point, and will likely force many to implement stronger age assurance measures and other protections.

Moreover, Ofcom is making it personal. “To ensure strict accountability, each provider must name a senior person responsible for illegal harms, such as terror, hate, and fraud, among many others.”

Protecting children from abuse and exploitation and protecting women and girls are also flagged as areas of concern.

Ofcom has given platforms three months to assess and mitigate the risks of illegal harm on their services, or face fines of up to 10 percent of global annual turnover (or £18 million, whichever is greater). The formal deadline for compliance with Ofcom’s illegal harms rules is March 16, 2025, and the regulator is not messing about, saying it is “ready to take enforcement action if providers do not act promptly to address the risks on their services.”

“We can take enforcement action as soon as the duties come into effect, and while we will support providers to comply with their duties, we won’t hesitate to take early action against deliberate or flagrant breaches.”

The regulator says it will also “use our transparency powers to shine a light on safety matters, share good practice, and highlight where improvements can be made.”

While porn sites and social media platforms are likely to get the most attention, the breadth of the rules means a wide variety of sectors will be affected.

Commentary in TechCrunch says “it’s fair to say that every tech firm that offers user-to-user or search services in the UK is going to need to undertake an assessment of how the law applies to their business, at a minimum, if not make operational revisions to address specific areas of regulatory risk.”

And more requirements are on the way. In comments to the BBC, Ofcom CEO Melanie Dawes says that in January, “we’re going to come forward with our requirements on age checks so that we know where children are. And then in April, we’ll finalize the rules on our wider protections for children – and that’s going to be about pornography, suicide and self-harm material, violent content and so on, just not being fed to kids in the way that has become so normal.”

Dawes says this is the “last chance” for the industry to address any problems before facing punitive measures. She has even dangled the possibility of a social media ban for kids in the mold of Australia’s recent legislation, saying that if platforms “don’t start to seriously change the way they operate their services, then I think those demands for things like bans for children on social media are going to get more and more vigorous.”

AVPA looks ahead to age assurance details in January

In response to a LinkedIn post from Ofcom marking the publication, the Age Verification Providers Association (AVPA) celebrated it as a success for the age assurance sector – but not without caution.

“This is a welcome step forward in the process but everyone recognises there is more to do,” says the comment from AVPA, which provided detailed feedback on draft versions of the guidance. The industry association is specifically awaiting policy that explicitly requires age assurance or age checks for creating social media accounts, to make sure platforms are adhering to their own age rules.

“That was a fundamental objective of the Online Safety Act, and the focus of a lot of debate in Parliament,” says the post. “If 7-year-olds are still opening accounts with impunity, that will be a major failure.”
