UK ‘super-complaints regime’ to super-charge Ofcom’s online safety enforcement

Complaints are one thing, but violators of the UK’s Online Safety Act are about to meet super-complaints. The Department for Science, Innovation & Technology (DSIT) has published a consultation outcome setting out the government’s stance on “super-complaints eligible entity criteria and procedural requirements.”
A Ministerial Foreword from Baroness Maggie Jones explains that “the objective of the super-complaints regime is to ensure that eligible entities can make complaints to Ofcom, as regulator, to make them aware of existing or emerging online harms. This will also support Ofcom’s horizon scanning function, supporting Ofcom in taking an agile approach to regulating online harms.”
Ofcom has been showing its regulatory agility (and teeth) recently, pursuing investigations against several websites it believes may be operating in violation of the Online Safety Act and its children’s safety codes. This week, it announced the launch of nine new investigations – seven into child sexual abuse imagery on file-sharing services, one into porn provider First Time Videos LLC, and one against notorious online cesspool 4chan.
The action against First Time Videos LLC centres on the requirement for adult content websites to have highly effective age assurance in place.
The others are prompted by “complaints about the potential for illegal content and activity on 4chan, and possible sharing of child sexual abuse material on the file-sharing services.”
Formally, the investigations will focus on whether platforms have appropriate safety measures in place to protect UK users from illegal content and activity; whether they complete – and keep a record of – a suitable and sufficient illegal harms risk assessment; and whether they respond to repeated statutory information requests.
Ofcom will judge complaints on a case-by-case basis against criteria
DSIT says it is important that the super-complaints regime will “ensure that eligible entities can also make Ofcom aware of any action taken by regulated services which is significantly adversely impacting users’ rights to freedom of expression.” Whereas some similar models establish a list of approved entities, Ofcom will operate on a case-by-case basis, weighing each complaint and complainant against eligibility requirements.
DSIT’s consultation, which included six criteria for eligibility, was more or less well received: “Respondents generally accepted our rationale for a case-by-case assessment of an organisation’s eligibility to submit a super-complaint, rather than pre-designating a list of complainants.” To address some concerns about rigidity, the criteria have been cut to four:
First, “whether the entity represents the interests of users of regulated services, or members of the public.” Second, “the composition of the entity and the arrangements for its governance and accountability, such that it can be relied upon to act independently from regulated services.” Third, “whether the would-be complainant is recognised as an ‘expert voice’ in relation to online safety matters and routinely contributes significantly, as an expert, to public discussions about online safety matters.” And, finally, “the entity can be relied on to have due regard to any guidance published by Ofcom.”
Other issues addressed in the consultation include evidentiary requirements, a 30-day pre-notification requirement (now removed), limitations on submissions, and what Ofcom is obligated to do when it receives and processes a complaint. Overall, DSIT has tightened response timelines, and expanded eligibility criteria to “include newer organisations that are experts in online safety matters, not just ‘experienced’ ones that have a track record of publishing high-quality research and analysis or collaborating with other organisations – a crucial change which recognises the value that entities of all experience levels have in the online safety sphere.”
Generally, DSIT says respondents were broadly supportive of the super-complaints policy proposals, and disagreements generated constructive feedback that led to policy changes. “In general, the feedback argued that any complex process for the submission, assessment and response to complaints potentially allowed real-life, real-time harms to continue unchecked and unchallenged. Others felt that the proposals leaned more towards managing Ofcom’s work and resources than enabling valid super-complaints from a range of different authors.”
Regardless, DSIT believes super-complaints are “an important part of the new framework, and will allow organisations to advocate for users, including vulnerable groups and children, to ensure issues affecting UK users or members of the public are brought to Ofcom’s attention.”
Kyle almost ready to show big package to the world
If it feels like the screws are tightening on online regulation, that’s not just because Ofcom is chucking out investigations like ninja stars. Peter Kyle is set to announce what the Telegraph calls “a package of potential new online safety measures” – and is said to be considering imposing a two-hour cap on social media for youth, and a possible 10pm cutoff, to combat screen addiction and doomscrolling.
Kyle says he is “trying to think how we can break some of the addictive behaviour and incentivise more of the healthy developmental, and also the good communicative side of online life.”
That is not to say Ofcom is far from Kyle’s mind, or his strategy. He notes that “in July, age-appropriate material must be supplied by platforms, otherwise there’ll be criminal sanctions against them.”
Kyle has previously proposed a nighttime curfew on social media, and there is also speculation about restrictions during school hours. All will be revealed in time: Kyle’s “package of measures” will get its money shot “in the not too distant future.”