Global movement coalescing around age verification and its role in online safety

Europe, the UK and Australia are introducing further mechanisms to improve online safety, with a focus on younger users. A European pilot has successfully trialed a way to let age verification follow web users via browser cookies, Google is introducing age verification in Australia following new legislation, and the team behind the forthcoming, ground-breaking UK Online Safety Bill has taken a moment to reflect on the Bill's scope.
Successful European trial of interoperable age verification tech
The euCONSENT project has successfully completed its first large-scale trial of interoperable online age verification in Europe. Participants who verified their age with a provider for one website saved a browser cookie that could then be reused at a different website.
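euCONSENT has not published the exact format of that cookie, but the general pattern of a signed, reusable age-check credential stored in the browser can be sketched roughly as follows. The claim fields, the HMAC shared-secret signing and the "ExampleAVProvider" issuer below are illustrative assumptions, not the euCONSENT specification.

```python
# Illustrative sketch of a reusable age-check token carried in a browser cookie.
# Field names, signing scheme and "ExampleAVProvider" are assumptions for
# demonstration only; they are not the actual euCONSENT design.
import base64
import hashlib
import hmac
import json
import time

SECRET = b"stand-in-for-provider-key-material"

def issue_age_token(over_18: bool, issuer: str = "ExampleAVProvider") -> str:
    """Provider side: called once after the user passes an age check on site A."""
    claim = {"iss": issuer, "over_18": over_18, "iat": int(time.time())}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"  # value stored in the browser cookie

def check_age_token(cookie_value: str, max_age_s: int = 30 * 24 * 3600) -> bool:
    """Relying-party side: site B reads the cookie instead of re-verifying the user."""
    try:
        payload, sig = cookie_value.rsplit(".", 1)
        expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False
        claim = json.loads(base64.urlsafe_b64decode(payload))
        return claim.get("over_18", False) and time.time() - claim["iat"] < max_age_s
    except (ValueError, KeyError):
        return False

# Site A verifies the user once and sets the cookie; site B simply checks it.
cookie = issue_age_token(over_18=True)
print(check_age_token(cookie))  # True: no second verification needed
```

In the trial, dummy websites played the roles of the two sites; a production scheme would presumably validate the token against the provider's published keys rather than a shared secret.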
Iain Corby, executive director of the Age Verification Providers Association and project manager of euCONSENT, told the Privacy Paths podcast from Privacy Laws & Business that the approach, an extension to the eIDAS framework, will roll out this summer.
The pilot, which operated across five countries and involved 1,600 adults and children, showed that the technology worked, according to Corby. euCONSENT is also working with the IEEE, the ISO and the campaign group 5Rights to make sure the process is understandable for children.
An initial report on the pilot is now available on the euCONSENT website, with full details to follow. The report explains how participants in Greece, the UK, Germany, Cyprus and Belgium interacted with dummy websites to test age verification interoperability.
In conversation with Privacy Laws & Business's Stewart Dresner, Corby also described how the biometric facial age estimation technology in use by the Age Verification Providers Association has an average error of 2.8 years for users between six and 60 years old. This drops to 1.3 years for six- to 12-year-olds and 1.55 years for 13- to 19-year-olds, meaning that for the key ages the average error is only around a year and a half "simply by asking for a selfie."
Facial age estimation software is now 99 percent accurate at stopping under-18s from accessing a site if the bar is set at 25 years of age.
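That figure reflects a buffer threshold: the estimated age has to clear a bar set several years above the legal age, so that the roughly 1.5-year average error almost never lets an under-18 through. A minimal sketch of that decision logic, with the fallback step and function names as illustrative assumptions rather than any provider's actual implementation, might look like this:

```python
# Rough sketch of a "challenge 25"-style age gate built on facial age estimation.
# The 25-year bar comes from the figure quoted above; the fallback step and
# function names are illustrative assumptions.
LEGAL_AGE = 18
ESTIMATION_BAR = 25  # bar set well above the legal age to absorb estimation error

def gate_access(estimated_age: float) -> str:
    """Decide how to treat a visitor based only on a facial age estimate."""
    if estimated_age >= ESTIMATION_BAR:
        return "allow"                    # estimate clears the buffer: admit on the selfie alone
    return "offer_alternative_check"      # below the bar: fall back to ID or another method

for estimate in (31.0, 22.5, 16.0):
    print(f"estimate {estimate:>4} -> {gate_access(estimate)}")
```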
Corby said that the association is helping to “take what have been the norms of society for many decades and apply those online” as one would not expect to see a child walking into a strip club or bookies on the high street. A discussion on the UK’s upcoming Online Safety Bill covered very similar topics.
‘Misunderstood,’ ‘brain drain’ and ‘children can’t wait’ as UK Online Safety Bill awaits Parliament
Speaking at the two-day AI UK event organized by The Alan Turing Institute, a panel discussed its delight, frustrations and hopes for the UK’s revised Online Safety Bill ahead of its passage through Parliament.
After the revised Online Safety Bill was published, representatives involved in its drafting and future regulation discussed its implications and acknowledged that while it felt like the end of the process, “really it is just the beginning as it undergoes democratic scrutiny,” said Katie Morris, head of Online Safety Regulatory Framework at the Department for Digital, Culture, Media and Sport.
Yoti offered a summary of the initial draft of the Bill when it was first published.
Damian Collins, chair of the Online Safety Bill Committee and a member of Parliament, welcomed the Bill’s requirement for online platforms to proactively search for harmful content rather than allowing their systems to amplify it. However, he finds the Bill’s language of ‘legal but harmful content’ problematic.
Katie Morris finds the Bill’s third objective – to tackle ‘legal but harmful content’ – to be misunderstood. She hopes that the requirement for companies to undertake risk assessments and provide clear terms of service will clarify the issue.
Baroness Beeban Kidron, House of Lords crossbench peer and chair of the 5Rights Foundation, regrets that the Bill does not include more of the Joint Committee’s recommendations, that parts of it may not be enforceable, that it does not link with the UK’s Age Appropriate Design Code, and that the regulator Ofcom is still planning to conduct more research into online harms to children despite plenty of research already existing.
Kidron was delighted by the age verification requirement for accessing pornography, but critical of the fact that it does not extend to other types of adult material. The Bill also does not include provisions against dark patterns and dissemination; for children it only addresses content rather than the method.
“What would you do differently if you knew the end-user was a child?” is the question Kidron wants the Bill to impose on providers. “How do we make sure that question is asked and answered?”
A political mechanism in the Bill means that it will be Parliament, not the Secretary of State, that is empowered to update the priority harms covered by the Bill, a move welcomed by Collins. Recommendation engines are going to be “front and center” for this regulation, according to the MP, as most of what people watch on YouTube, for example, arrives via recommendation. If the AI is not up to keeping users safe, more human moderators may be needed.
Transparency notices are to become a duty for online providers. They will be required to report to Ofcom, the regulator, what measures they have taken towards harm reduction.
“For the first time Ofcom is going to be looking over your shoulder,” said Anna-Sophie Harling, Online Safety Principal at Ofcom. “It’s not just the Bill, but what the regulator does with those powers.”
Ofcom is creating a large team in Manchester to handle the upcoming regulatory requirements so rapidly that panelists claimed there was an industry brain drain to the agency.
But Kidron believes something is missing on transparency reporting: while platforms will report to the regulator, there is no equivalent requirement for them to report to users. Morris said the requirements for clear terms of service will address this.
While Collins believes the mechanisms within the Bill mean Parliament will be better able to keep it up to date, Kidron wants more urgency now and believes the government needs to fast track some of the legislation: “the children can’t wait.”
Google to require age verification following Australian federal declaration
Users of YouTube and Google Play in Australia will have to verify their ages to access adult content following the federal government’s Australian Online Safety (Restricted Access Systems) Declaration 2022, reports iTnews.
Users seeking to view adult material may be required to upload an image of their government ID or credit card, according to a Google blog post referenced by iTnews.
“If our systems are unable to establish that a viewer is above the age of 18, we will request that they provide a valid ID or credit card to verify their age. We’ve built our age-verification process in keeping with Google’s Privacy and Security Principles,” states the blog.
Valid IDs include driver’s licenses, proof of age cards and passports. ID documents will be deleted after the date of birth has been verified.
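The quoted description amounts to a two-step fallback: try to establish that the viewer is over 18 from signals already on the account, and only if that fails ask for a document or credit card, discarding the document once the date of birth is confirmed. A minimal sketch of such a flow is below; all names and the crude age calculation are invented for illustration and do not reflect Google’s actual systems.

```python
# Illustrative two-step age-verification fallback based on the description above.
# All names (AccountSignals, verify_with_document, etc.) are invented for
# illustration; this is not Google's implementation.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AccountSignals:
    declared_birth_date: Optional[date]  # e.g. from account registration

def established_as_adult(signals: AccountSignals, today: date) -> bool:
    """Step 1: try to establish 18+ from signals already held on the account."""
    if signals.declared_birth_date is None:
        return False
    age = (today - signals.declared_birth_date).days // 365  # crude, for brevity
    return age >= 18

def verify_with_document(document_image: bytes, extracted_birth_date: date, today: date) -> bool:
    """Step 2: fall back to a valid ID or credit card when step 1 fails."""
    is_adult = (today - extracted_birth_date).days // 365 >= 18
    del document_image  # symbolic: the stated policy is to delete the document once checked
    return is_adult

signals = AccountSignals(declared_birth_date=None)
if not established_as_adult(signals, date.today()):
    ok = verify_with_document(b"<scanned id>", date(1990, 5, 1), date.today())
    print("access granted" if ok else "access restricted")
```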