Ofcom’s new Children’s Safety Codes require “highly effective age checks”
Ofcom has dropped the proverbial hammer on age-sensitive internet services, releasing an update that sets out more than 40 practical steps businesses must take to keep children safer, including robust age verification.
A release from the UK’s communications regulator says the draft Children’s Safety Codes of Practice require firms first to assess the risk their service poses to children, and then to implement the safety measures necessary to prevent kids from encountering any content relating to suicide, self-harm, eating disorders or pornography. Social media sites, apps and search engines will be required to conduct “robust age-checks” for those accessing content on those topics, and are also expected to minimize children’s exposure to violent and hateful content, including abusive material, online bullying and anything promoting dangerous challenges.
“We want children to enjoy life online,” says a statement from Ofcom Chief Executive Dame Melanie Dawes. “But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control.” Dawes says the proposed Codes are in line with new online safety laws in making tech firms responsible for children’s safety. “They will need to tame aggressive algorithms that push harmful content to children in their personalized feeds and introduce age-checks so children get an experience that’s right for their age.”
Dawes says Ofcom’s measures go beyond current industry standards, and that once they have been presented to Parliament for approval next spring and are in full force, the regulator will not hesitate to use them to hold platforms to account.
Age assurance must be “technically accurate, robust, reliable, and fair”
For biometrics and digital identity firms, the key language concerns the requirement for online services to “implement highly effective age-checks” to prevent children from seeing adult content. Ofcom expects “much greater use of highly-effective age-assurance.” As to what that means exactly, a footnote specifies that tools “must be technically accurate, robust, reliable, and fair. Examples of age assurance methods that could be highly effective if they meet the above criteria include photo-ID matching, facial age estimation, and reusable digital identity services.”
For many experienced biometrics providers, this is a made-to-order situation. In recent testimony to California’s Assembly Judiciary Committee hearing on Assembly Bill 3080 – also known as the Parent’s Accountability and Child Protection Act – Iain Corby of the Age Verification Providers Association (AVPA) said “online age verification can be done and indeed is already being done at scale, anonymously, effectively, inclusively, conveniently and cheaply.”
Age verification providers, says Corby, “can and should be independently audited against approved international standards. They will check that the results are not only accurate, but also that any data used in the process is processed securely and not retained after the check is completed, with heavy penalties for retaining personal data illegally.” Corby notes that all of this can be achieved for around 12 cents per check.
Robin Tombs, CEO of Yoti, which is among the AVPA’s members, says years of providing privacy-preserving, reliable and effective age assurance to social media, adult and dating sites have given his digital ID firm a broad perspective on the evolution of age assurance requirements.
“There is no one silver bullet when it comes to child safety, but effective age checking will be an essential part in protecting children from accessing harmful content online,” says Tombs. He also notes the importance of offering people a choice in how they prove their age, to ensure inclusivity and accessibility. “Thankfully Ofcom has recognized that facial age estimation, reusable digital identity services and photo-ID matching are all highly effective solutions.”
Among Yoti’s clients is the UK-based adult content creator platform OnlyFans, which is currently the subject of an Ofcom investigation. At issue is the configuration of client-controlled age thresholds – suggesting that, even with turnkey tools, businesses may struggle to achieve compliance across the board, at least without help.
Article Topics
age verification | AVPA | biometrics | children | digital identity | face biometrics | Ofcom | Online Safety Act | Yoti