Australia introduces draft codes on age assurance for adult content sites
It’s about to get harder for underage Australians to watch porn online. Industry groups have released draft codes on age assurance that could prevent adult content sites from appearing in search results or being linked on social media unless they implement effective age checks.
The phase two codes firm up existing voluntary codes and standards. The new rules would come into effect in 2025 and would apply to pornography and gambling content across websites, social media and video games, covering search engines, gaming companies, app developers and internet service providers, among others.
The social media angle puts the onus on platforms to direct porn or self-harm content away from children’s feeds and to filter, blur or otherwise weed out explicit content to the “extent technically feasible and reasonably practical.” It is significant in part because the government is still in the trial phase of age verification for social networks, the outcomes of which will have further implications for the codes.
The codes prescribe no specific method for conducting age assurance to access adult content, meaning there is flexibility in how sites choose to implement age checks. Examples include checking photo ID, facial age estimation, credit card checks, digital ID wallets or systems, or attestation by a parent or guardian. Clicking a button that says you are of age will no longer cut it.
In U.S. markets that have passed similar laws, such as Texas, smut firm Pornhub has chosen the withdrawal method, cutting off access rather than implementing online age checks. It is a concrete illustration of this warning from the Australian eSafety commissioner, which the Guardian obtained via FoI: “No countries have implemented an age verification mandate without issue.”
The draft codes were developed by industry groups the Australian Mobile Telecommunications Association (AMTA), the Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc. (Digi) and the Interactive Games and Entertainment Association (IGEA). Submissions on the draft will be accepted until November 22, after which they will be fine-tuned ahead of a December 31 delivery deadline.
“Once finalized, these Draft Safety Codes will make an invaluable contribution to protecting children from online pornography and other harmful content,” says Dr. Jennifer Duxbury, director of policy, regulatory affairs and research for Digi. “We encourage all stakeholders, including consumer organisations, civil society groups, academics, industry, parents and community members, to have their say on the Draft Codes and provide feedback.”
Age assurance tech exists, but privacy safeguards need priority: ConnectID
ConnectID’s Andrew Black believes that if age restrictions on social media are to work, the focus must be on privacy. In an opinion piece for InnovationAus, Black says although much time has been spent discussing the technology required to create and enforce age verification laws, effective age verification tech actually already exists and is relatively simple.
“The bigger question lies in how we apply it while safeguarding privacy, offering real choice, and preventing new privacy risks associated with verifying users’ ages – in a nutshell, ensuring these individuals are the focus of the solution,” Black says.
He is preaching a sermon that will be familiar to anyone following global age assurance debates: governments and regulators wring hands over how to stop kids from accessing adult content online, while age assurance vendors shout from the sidelines that they’ve been doing this for years.
“With digital ID solutions, we can confirm someone’s age without requiring them to hand over official documents or sensitive information to social media companies,” he writes. Products like ConnectID are “designed to securely verify people’s information – including age – through trusted third parties, like banks, which already hold this data. The platforms can simply plug into a system that uses trusted sources to verify only the necessary details, for example an age assertion of over or under a certain age, rather than exposing an entire ID or date of birth.”
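The exchange Black describes can be sketched in a few lines: the platform queries a trusted verifier that already holds the user's details and receives back only a yes/no age assertion, never the document or date of birth itself. The sketch below is purely illustrative, with hypothetical names; it does not depict ConnectID's actual API.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class VerifierRecord:
    """A record held by a trusted third party (e.g. a bank). The platform never sees it."""
    user_id: str
    date_of_birth: date

class TrustedVerifier:
    """Hypothetical verifier that answers age queries with a single boolean."""
    def __init__(self, records):
        self._records = {r.user_id: r for r in records}

    def assert_age_over(self, user_id: str, threshold: int) -> bool:
        """Return only an over/under assertion, not the underlying date of birth."""
        dob = self._records[user_id].date_of_birth
        today = date.today()
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= threshold

# The platform "plugs into" the verifier and learns exactly one bit of information.
verifier = TrustedVerifier([VerifierRecord("alice", date(1990, 5, 1))])
is_adult = verifier.assert_age_over("alice", 18)
```

The design choice the sketch illustrates is data minimization: the platform's side of the exchange holds only `is_adult`, so a breach of the platform exposes no identity documents or birth dates.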
Black says the EU GDPR can serve as a model for how to incorporate effective enforcement mechanisms and offer incentives. “The consequences of not complying with the GDPR are significant,” Black says. “For severe violations, fines can be up to €20 million (US$21.6 million) or 4 percent of global annual turnover, whichever is higher. For the major social media platforms, this can mean fines in the billions of dollars.”
Regular audits could keep platforms in line, while early adopters who set an example could enjoy reduced regulatory burdens or public recognition, to “foster a culture of proactive compliance rather than one focused solely on avoiding penalties.”
For Black, the key point is that while there is work to be done on age assurance regulations and deployment, the tools already exist.
“Governments don’t need to invest taxpayers’ money to create new solutions,” Black says. “There are already a variety of options in the market that can be adapted to meet this demand.”
Article Topics
age verification | Australia | biometrics | children | connectID | digital identity | regulation | social media