New Ofcom proposal calls for highly effective age assurance for livestreams

Ofcom is determined to make the most of its “Year of Action,” having released yet another set of proposals to strengthen its Codes of Practice. The UK regulator has opened consultation on the document, and given stakeholders until October 20, 2025 to respond.
According to a release, the new measures propose reducing the spread of illegal content by making improvements to recommender systems and crisis response protocols; expanding the use of “proactive technologies,” such as automated CSAM, deepfake and suicide content detection, to block illegal images; and strengthening protections for children by placing restrictions on interactions in livestreams and “making more use of highly effective age assurance” to help protect children from grooming in user-to-user services.
Virality is a new concern: “if illegal content spreads rapidly online, it can lead to severe and widespread harm, especially during a crisis, like the violent riots that followed the Southport murders last year, or if a terrorist attack is livestreamed,” says a blog from Ofcom. “Recommender systems can exacerbate this.”
“If a site or app allows livestreaming, it should have a system which makes it clear to them when a user reports a livestream where there is a risk of imminent physical harm, and have human moderators available at all times to review content and take action in real time.”
Grooming is also a problem that Ofcom hopes to address, in part through the deployment of age verification or age estimation technology. “Under our existing codes, providers should already be taking steps to protect children from grooming,” it says. “Now that we have published our guidance on highly effective age assurance, platforms should use robust age checks to underpin the measures they take to protect children from grooming and harms associated with livestreaming.”
The livestreaming rule brings platforms like Twitch into the age verification debate, joining social media platforms, video gaming hubs and pornographers in the online age verification rumble.
Ofcom’s Online Safety Group Director Oliver Griffiths says the regulator is committed to “holding platforms to account and launching swift enforcement action where we have concerns.”
“But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online. So today we’re putting forward proposals for more protections that we want to see tech firms roll out.”
The full proposal can be downloaded here. Meanwhile, Ofcom is sharpening its blades ahead of July 25, when it formally begins enforcing age assurance requirements in its children’s code.
Article Topics
age verification | biometric age estimation | children | digital trust | Ofcom | regulation