Australian regulators roll out new age assurance guidance, enforceable industry codes

Ongoing campaign aims to align youth access to virtual, physical spaces

The Office of the Australian Information Commissioner (OAIC) has published new guidance on age assurance technologies. A release from the OAIC says that, with the significant spike in the number of age checks being performed following Australia’s Social Media Minimum Age Act and other online safety codes, it is intended to help parties “work through the privacy issues associated with choosing and implementing age assurance methods.”

According to Privacy Commissioner Carly Kind, the guidance clarifies the OAIC’s expectations, emphasising necessity and proportionality, transparency, effective complaints mechanisms and strong vendor controls. It calls on online services to confirm that checking users’ ages is genuinely necessary, adopt privacy by design, “undertake due diligence to ensure the security of the entity’s age assurance ecosystem,” and “assess risk and choose age-assurance methods that are proportionate and data minimising.”

Clear consent must also be established for collection of biometrics and other sensitive personal information.

Kind says “age assurance is not a blank cheque to use personal or sensitive information in all circumstances and must not erode Australians’ privacy rights.”

“Entities need to stop and think about the goals of performing an age check, whether it is even necessary in the first place, and ensure strong governance across the ecosystem. And offering individuals transparent, data-minimising options to validate their age is important if entities want to use these technologies as a gateway to age-appropriate experiences online.”

New codes aim to restrict StripChat like strip clubs

In tandem, the Office of the eSafety Commissioner – occasionally the favourite villain of Silicon Valley – continues to make a case for the necessity of online age assurance, and to roll out additional safety codes.

A post on LinkedIn lays out the oft-cited physical analogue: “a child cannot walk into a bar and order a drink. They cannot stroll into a strip club, browse an adult shop or sit down at a blackjack table in a casino.” Yet, online, “a child who could never enter a physical adult venue can, within seconds, access pornography more extreme than anything sold behind the counter of a bricks-and-mortar adult store. They can stream high-impact violent footage far worse than anything they might see by sneaking into an R-rated movie.”

Society, says eSafety, has “tacitly allowed the growth of a digital world that ignores principles and rules we enforce in the physical one.”

The new Age Restricted Material Codes aim to address this problem. “The idea behind these codes is that these principles should be consistent: environments designed for adults should not be freely accessible to children, regardless of whether they are built with bricks or computer code.”

The enforceable industry codes “cover most corners of the online ecosystem, from device manufacturers, gaming services and app stores to social media, messaging services, generative AI systems, websites and search engines.”

They “require the entire online industry to put in place meaningful protections preventing children’s exposure to content they are not ready to see,” including extreme violence, pornography, self-harm, suicide and disordered eating content.

The Commissioner’s statement is neutral on the precise methods companies use to conduct privacy-preserving age checks – noting that Aylo, which helped author the codes, has switched to a pay-for-NSFW content model in Australia – but expresses confidence that “companies have the capability to develop and deploy the technologies that protect children from age-inappropriate material and still allow adults to access this content.”

“For years, the dominant and often self-serving narrative has been that digital spaces are too complex, too global or too technologically fluid to regulate in the same way as physical ones. But complexity should never be used as an excuse for inaction.”
