
In Australia, platforms skirt age laws, but the problem isn’t age assurance tech

Debates circle truth about social media hiding in plain sight
Consider the shovel. A sturdy tool based on an ancient design, a shovel is great for uprooting things. However, if there are poisonous shrubs that need digging up, a shovel is only useful if one elects to use it, and specifically to use it to dig rather than to, say, brain gophers. In this, a shovel is like any tool – and that includes biometric age assurance technology.

Perspectives on age assurance legislation and implementation continue to clash in Australia, which led the way on social media age restrictions with its Social Media Minimum Age (SMMA) requirement, in effect since December 2025. The country has become a global hotspot for progress and debate over online age checks, as Big Tech tries its best to maneuver amid incoming regulations.

Pane comes for 2025 Age Assurance Technology Trial with weak sting

The Australian government’s Age Assurance Technology Trial (AATT), which published its final report in September 2025, continues to draw scrutiny from a former member of the Stakeholder Advisory Board who resigned over concerns about the trial’s methodology.

A release from non-profit privacy watchdog Electronic Frontiers Australia (EFA) claims that documents newly released under a freedom of information request made by news publication Crikey show that the Office of the Australian Information Commissioner (OAIC) shares EFA’s concerns about “inaccurate claims around privacy compliance” by participating vendors in the AATT’s draft and final reports.

The OAIC’s internal assessments, EFA says, “align substantially with the early warnings and independent observations” by EFA Chair John Pane, the former AATT board member who has publicly accused the trial of being light on substance and prioritizing “political sound bites.”

“These FOI documents validate EFA’s position and its strong concerns about misleading privacy claims made by the AATT, demonstrating that Australia’s chief privacy regulator shared similar concerns about the trial’s methodology and findings on privacy issues,” Pane says.

Pane: AATT did not do things it never intended to do

Pane’s concern was always rooted in a dissatisfaction with the established scope of the trial. Both the government and trial organizers at the Age Check Certification Scheme (ACCS) emphasized throughout that the undertaking was to establish whether or not privacy preserving age assurance was possible in Australia.

Accusing the AATT of “privacy washing,” Pane says the trial set “an incredibly low bar for vendor compliance, bizarrely inferring operational privacy capabilities simply by reading participants’ externally facing privacy policies.” Yet the trial was not a compliance certification, and while vendor statements played a role in evaluations, external testing also took place for top vendors.

Pane decries that “when the AATT Final Report was eventually released, it was predictably cloaked in government-friendly political rhetoric and sound bites, broadly claiming the technology was ‘private, robust, and effective.’ Yet, while comprehensive in page count, the report conveniently excluded fundamental performance indicators from its scope – most notably, the ease with which these technologies can be circumvented by technical means or third-party collusion.”

Circumvention was not within the scope of the trial, which aimed only to establish the possibilities in the context of currently available technology.

“EFA was the first civil society organisation to give the AATT a ‘big red F,’ and the release of the OAIC documents proves that assessment was entirely justified,” Pane says, clearly quite chuffed with himself.

In the documents released under the FOI act, the OAIC says “overarching concerns remain regarding the conclusive references to privacy and language in the report that overstates the privacy evaluation that has taken place in the Australian context.”

Regulation for me, but not for thee

If Pane seems proud of his would-be gotcha, it is surely because it reinforces the position his job requires him to take, not because his proposed alternative is any stronger. Age assurance laws and age verification technology, he says, are a band-aid on the real problem.

“Instead of pursuing a fundamentally flawed prohibition model, the focus must shift to regulating the platforms themselves,” he says. “We urgently need to break the surveillance-based, data-extractive business models of social media giants. The solution lies in forcing a statutory digital duty of care onto these platforms to protect all users – not just children – from algorithmic manipulation and digital surveillance, while simultaneously uplifting digital civics and online safety education for primary and secondary school students.”

Again, we have an argument that says age assurance asks for too much data (proof of age) in order to keep kids off the social media platforms that have created and shaped our data-ravenous culture. Kids shouldn’t have to give away a secret, EFA says, to gain access to the secret stealing machine.

Moreover, his suggestion that the government establish a digital duty of care instead of regulating social media platforms with age checks is noble but naive, and it exposes the contradiction in Pane’s argument. While he proclaims the need to break the “surveillance-based, data-extractive business models of social media giants,” he also suggests we can fix them. Put some proper laws on the Silicon Valley types, the thinking goes, and they’ll shape up. Besides, we can teach the kids to be smart online, can’t we?

This is tantamount to suggesting that the real way to stop kids from smoking is to force tobacco companies to make their product less addictive and harmful to human health. But we know now that cigarettes are inextricably tied to their addictive quality; addiction is, arguably, the main product. The answer is not for society to pursue a healthier cigarette. Indeed, the UK government has taken a different path: under the incoming generational tobacco ban, no one born on or after January 1, 2009 will ever be able to legally purchase cigarettes. The idea is to phase the habit out of memory through a moving age threshold.

In conclusion, then, Pane’s boastful nose-thumbing at the AATT does contain the kernel of a useful idea. If he is serious about his concerns, the EFA must begin a comprehensive campaign pushing Australia to phase out existing large social media platforms entirely. Otherwise, Pane and his peers will spend their days chasing an oxymoron in “privacy preserving social media.”

Laws ineffective when companies break them

Another of the EFA’s arguments can be stated as follows: age assurance in Australia isn’t working because the social media platforms aren’t doing it. Indeed, data from a recent survey shows that 70 percent of users under 16 who had accounts on Instagram, Snapchat and TikTok before the prohibition maintained access. This suggests early reports of takedowns in the tens of millions are probably overblown.

The government, however, is not the one filing the numbers; that would be the social media sites, which are supposed to be following the law. A report from the Guardian quotes Australia’s communications minister, Anika Wells, who says the data is “evidence of the absolute bare minimum from social media companies,” who benefit from reporting that says the law isn’t working.

“They want you all to report that the laws are failing. That helps them in their quest to reduce regulation, to minimize it the world over. So I’m not surprised by any of this. We expected this.”

OAIC launches public consultation on Children’s Online Privacy Code

The government, to its credit, appears committed to careful assessment of how its law is playing out in real time. The eSafety Commissioner has already launched a major evaluation to take place over multiple years.

Now, the OAIC is seeking public input on an exposure draft of its Children’s Online Privacy Code, which a release says includes new rules requiring agencies and organizations to “ensure they consider the best interests of children before collecting, using or disclosing their personal information.”

Unlike the Social Media Minimum Age Act, the code applies broadly across online services where children are likely to face the highest risk. That implicates “most apps, games and websites that children and teenagers use daily, as well as online services that are primarily concerned with the activities of children.”

Under the code, direct marketing is only permissible with consent, when in the child’s best interests and when the personal information is collected directly from the child (rather than third parties); kids get a right to request destruction of their personal information; and privacy notices and policies must be written in clear, accessible, age-appropriate language.

The code draws from like-minded international legislation such as the UK’s Age Appropriate Design Code, but includes “some novel protections” geared toward kids. One is a requirement that online services notify children when their parents consent to the collection of personal information on their behalf. Another notifies them when other users (including parents) are tracking their geolocation.

Public consultation on the draft code runs for 60 days, until June 5, 2026. Kids, parents and carers, industry, civil society and other interested parties are invited to offer input.

“It has been estimated that by the time a child turns 13, around 72 million pieces of data will have been collected about them, making them vulnerable to harms from data breaches, discrimination, algorithmic bias and targeted advertising of harmful products, amongst other risks,” says Privacy Commissioner Carly Kind.

The code, she says, won’t restrict or limit young people’s participation in online spaces. “Instead, it raises the standard for privacy protections in Australia and puts the onus on online services to do better when handling children’s personal information online.”

The OAIC recently participated in the “global privacy sweep” with other data protection and privacy authorities from around the world, examining almost 900 websites and apps used by children. The sweep found that the amount of data being collected on a regular basis has increased since 2015.
