Australia gives companies 6 months to draft online child-safety codes

In a bid to safeguard children from graphic and harmful online content, Australia’s eSafety Commissioner has issued directives to “key” players in the online industry. These companies now have six months to establish enforceable codes designed to shield children from exposure to explicit pornography and other high-impact materials. The codes will have to include age assurance methods. At the same time, the government is receiving criticism for not requiring major platforms to participate in its age assurance trials.
The upcoming codes aim to prevent young children from encountering inappropriate content and to empower Australian internet users with tools to manage their exposure. While the primary focus is on pornography, the codes will also address other high-impact content, including themes of suicide, self-harm, and serious illness.
The industry is expected to present a preliminary draft of the codes by October 3, 2024, with final codes due by December 19, 2024. A public consultation will also be held on the proposed codes.
To assist the industry, the eSafety Commissioner has published a position paper outlining expectations for child protection measures. Additionally, an age assurance tech trends paper has been released to provide context on recent developments in age verification technology. The trends paper surveys how online platforms and other countries have applied various age assurance methods, including biometric facial age estimation, and covers other market developments such as NIST’s evaluation of age estimation algorithms.
The codes will be enforceable and apply across various digital platforms, including app stores, websites (including pornographic sites), search engines, social media, hosting services, internet service providers, messaging apps, multiplayer gaming, online dating services, and equipment providers.
Julie Inman Grant, the eSafety Commissioner, highlights the pervasive and invasive nature of online pornography, which often reaches children unintentionally. “Our research indicates that the average age at which Australian children first encounter pornography is around 13, with a third encountering it at even younger ages, often by accident,” she says.
“Sixty percent of young people reported encountering pornography on social media platforms like TikTok, Instagram, and Snapchat.”
These new codes will complement existing protections under the Online Safety Act, such as the Restricted Access System Declaration and the Basic Online Safety Expectations Determination. They will also align with broader government efforts, including the Age Assurance Trial, Privacy Act reforms, and initiatives under the National Plan to End Violence Against Women and Children 2022-2032.
Despite these efforts, social media platforms like Meta and TikTok will not be required to participate in the government’s trial of age assurance technologies, raising concerns about the trial’s efficacy, InnovationAus reports.
The eSafety Commissioner has warned that should the industry fail to develop satisfactory codes, she will impose her own standards, as was done for abhorrent online content.
Last year, the Albanese government in Australia decided against implementing a mandatory age verification system for online pornography and other adult content, citing the underdeveloped state of existing technology solutions.
Article Topics
age verification | Australia | biometrics | children | face biometrics | legislation | Online Safety Act (Australia) | social media | standards