Privacy, security, speech top concerns in statements on Australia’s search code

Having issued a call for submissions regarding the implementation of Australia’s Internet Search Engine Services Online Safety Code, which includes age verification rules, the Environment and Communications References Committee has published the submitted statements.
The submissions, of which there are 31 in total, come from a variety of organizations and stakeholders, from advocacy groups to social media firms to the eSafety Commissioner. Browsing the statements, one finds the messaging aligns much as expected. Digital rights groups object to age verification, claiming it is incompatible with digital privacy. Government agencies want to protect the children. The adult industry is worried about free speech. Social media companies say they want to protect the children, but think it should be someone else’s problem, and object to the incoming restriction of social media accounts to users aged 16 and over.
The concerns: privacy, censorship, ineffectiveness
The Australian Research Council Centre of Excellence for the Digital Child says that “the use of age assurance tools raises ethical and legal considerations around consent as they may contravene UNCRC principles relating to the collection of data or metadata from children.”
Australian charity Digital Rights Watch goes further, claiming that “the introduction of age verification for online content raises profound concerns about privacy, data protection, and proportionality. Invasion of privacy is inherent in any system that requires individuals to prove their age before accessing certain material.”
In its submission, the Eros Association, which represents Australia’s adult content industry, focuses on the perceived threat to freedom of expression. “The ISES Online Safety Code and other Online Safety Codes must not become de facto censorship tools that restrict essential health and wellbeing resources or undermine lawful access for adults.” Regarding search specifically, it worries that “decisions about what Australians can access online risk being outsourced to multinational search engines.”
Scarlet Alliance, which represents Australian sex workers, says age assurance technologies “are not fit-for-purpose,” and that “mandating under-tested technology generates significant privacy and data security concerns.”
X, Elon Musk’s rebranded Twitter on which he regularly proselytizes about free speech, advocates for “a balanced approach which protects children without compromising their privacy, freedom of expression and access to information.” But it has “serious concerns as to the lawfulness of the Social Media Minimum Age, including its compatibility with other regulations and laws, including international human rights treaties to which Australia is a signatory.”
A joint statement from the Australian Child Rights Taskforce, ChildFund Australia and Dr. Rys Farthing of the University of Canberra says the current “age-assurance-first” approach is backwards and “misaligns the problem with the solution.”
“The impetus for both of these regulations is the historically poor safety practices that abound across digital platforms and search engines, but the target for the solution is end users. That is, the key problem is an industry failing and the key actors are industry, however instead of addressing industry directly, this remedy requires individual Australians users to age-assure.”
They argue for “an overall safety-settings first approach” that aligns with the EU’s Digital Services Act.
The Internet Association of Australia calls out the government for moving ahead with legislation “without sufficient consultation or evidence” – and asks, specifically, why the development of the laws “took place alongside the Age Assurance Technology Trial as led by the Department of Infrastructure, Transport, Regional Development, Communications and the Arts, prior to the Trial being completed or its findings fully assessed.”
Dr. Angie Simmons, founder of Hashtag.AI, raises the difficulty of regulating decentralized networks: “For example, consider the difficulty of introducing age verification for Mastodon, which is open-source software that allows anyone to self-host their own social media network, and to connect this with the rest of the ‘fediverse’ through the ActivityPub protocol.”
The defenses: give kids the safety they’re asking for
The assessments, of course, are not all bad.
In its statement, the Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts hails age assurance as an important part of an overall solution. “If deployed proportionately and effectively, age assurance provides a key opportunity to enhance the fundamental rights of children in a digital age, by reducing their exposure to harmful content and experiences,” it says. “At the same time, age assurance can help protect and preserve the freedom of adults to enjoy online goods, content and services.”
The Australian Information Industry Association (AIIA) supports the development of “clear guidelines on what constitutes acceptable age assurance under the Phase 2 Industry Codes.”
“We urge that any guidance or standards remain principles-based and flexible, allowing providers to use a range of age assurance tools (and combinations thereof) that best fit their service model and risk level. Importantly, this approach will encourage innovation and improvement in age assurance technologies over time, rather than locking industry into a narrow solution. Providers should be empowered to adopt methods proportionate to their service’s risk profile.”
The submission from the NSW Advocate for Children and Young People, Zoë Robinson, raises an interesting and under-discussed point: some of the call for regulations on social media is coming from kids themselves. “The better regulation of online content, and the promotion of more positive content and mental health resources is something young people have been asking for and is a significant step forward in making the online world safer for children and young people. Some of the themes highlighted by children and young people in the 2024 Social Media Summit and the Youth Week Polling data include concerns around harmful exposure to pornography, disordered eating content and fake news.”
The Alannah & Madeline Foundation, a charity focused on preventing violence against children, welcomes “the creation of codes for industry to help prevent and reduce children’s exposure at a systems level, as we believe many risks are rooted in the design and functioning of digital products and services themselves.”
Moreover, it is “encouraged by the finding of the Age Assurance Technology Trial that age assurance can – from a technical perspective – be done privately, effectively and efficiently in Australia.” It aligns with Dr. Rys Farthing in advocating for a safety-by-default approach, with age assurance only applied as a secondary measure for those actively seeking adult content.
Spring of hope or winter of despair?
Age assurance is a boon; age assurance is a threat. Age checks protect kids; age checks chill free speech. Age verification puts porn back on the top shelf; age laws that treat sexual content as inherently harmful marginalize sex workers. It was the best of tech; it was the worst of tech.
The debate around age assurance continues apace, as the tech makes its way into the mainstream. Many are certain of their position, and many others are equally certain of the opposite. Perhaps the most salient submission comes from MP Nicolette Boele, who, on behalf of her constituents, pleads bafflement.
“However noble the ambition of the policy, the lack of detail regarding the proposed implementation is creating confusion and concern, while providing a vacuum for misinformation,” she says. “The number of questions, speculation and general concern about the utilization of age verification and content filtering as a means of broadly managing online access suggests to me that the Government should be undertaking an extensive public education campaign well ahead of any community rollout.”
Lab conducts independent testing of age estimation as porn visits plummet
One thing, at least, is becoming clear: age checks, in the form of biometric age estimation or age verification, discourage people from watching porn – at least on the biggest sites. According to ABC News, visits to Pornhub plummeted by nearly 50 percent (more than a million visitors) in the UK after age check requirements under its Online Safety Act took effect in July. Competitor Xvideos was down 47 percent over the same period.
That doesn’t necessarily mean people aren’t watching porn. A statement from Pornhub says that, “as we’ve seen in many jurisdictions around the world, there is often a drop in traffic for compliant sites and an increase in traffic for non-compliant sites.”
The issue of VPNs remains pressing, as usage of the IP-masking technology has skyrocketed in proportion with Pornhub’s fall in user traffic. More concerning, however, are findings from a separate ABC report, which claims leading facial age estimation tools were “easily fooled by a twenty-two dollar ‘old man’ mask, a Guy Fawkes mask, and other cheap party costumes.”
“[We tried] the funny moustache, big fat nose and glasses, a Guy Fawkes mask, happy face, sad face, things like that,” says Professor Shaanan Cohney of the University of Melbourne, which is running the study with Princeton.
“Every age assurance vendor that we tested had one bypass that was easily accomplished with things that you could buy at your local $2 shop.”
The study only tested three facial age estimation systems, which are not named in the report. Iain Corby, executive director of the Age Verification Providers Association, notes that the test also used more realistic masks, that age assurance is not perfect (only close) and that the latest models may have performed better.
The University of Melbourne study is ongoing and the results are only preliminary. “We’re not going to make any strong claims about that until the data is actually in,” Cohney says.
The professor has previously dismissed the final report of the Australian Government’s Age Assurance Technology Trial, saying it “understates risk, overstates effectiveness, and falls well short of the standard security and privacy researchers expect for a high-stakes, society-wide intervention.”