Australian eSafety Commissioner announces platforms covered by social media law

Comments made this week at a press conference held by Australia’s Minister for Communications and eSafety Commissioner aim to bring clarity to parents and children about the country’s much-ballyhooed online safety law, which puts age verification requirements on social media platforms. The clarity could not come at a better time, as some question why certain platforms are covered by the law and others – notably social gaming platform Roblox – are not.
According to a release, “eSafety has informed Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick and Reddit of its view they are age-restricted platforms required to comply with Social Media Minimum Age restrictions from December 10.”
Communications minister Anika Wells says “this means from 10 December, these services must take reasonable steps to prevent under-16s from holding accounts, and failure to do so could warrant fines of up to 49.5 million dollars” (about US$32 million). Reasonable steps include biometric age assurance tools such as age verification and facial age estimation services.
Wells says that, while eSafety Commissioner Julie Inman Grant has pledged to keep a “dynamic” list that factors in changes to platforms and emergent technology, “we understand that families need certainty now, regarding the major platforms captured under the social media minimum age laws.”
“We have met with several of the social media platforms in the past month so that they understand there is no excuse for failure to implement this law.”
Gaming, messaging, educational platforms left off list – for now
Inman Grant says the assessments “are purely based on the criteria set out in the legislation, including the sole or significant purpose of the platform to enable online social interaction,” outside the other key consideration: “whether the platform meets the criteria of a class of services where there is an exception, such as for messaging, online gaming, or educational content.”
“This was not a compare and contrasting exercise. In order to be consistent and fair, we assessed each platform or service on its merits against the criteria in the legislative rules.”
Platforms assessed but not covered by the law include Discord, GitHub, LEGO Play, Roblox, Steam and Steam Chat, Google Classroom, Messenger, WhatsApp and YouTube Kids, which currently do not meet the criteria for “age-restricted social media platform.” However, Inman Grant is urging firms to self-assess regularly, and in particular when new features are added, to ensure they don’t tip into qualifying without realizing it.
Comments from the two Australian leaders at times seem to frame the law as a grand experiment – which, in some ways, it is, in that Australia is the first to try it, and other nations are watching to see how it goes. However, while the crackdown on addictive and harmful design features is warranted, there is a hint of nostalgic fantasy in how it has been branded.
Inman Grant says how the law impacts young people will be “felt over a much longer period of time,” and that the government is working with academics to evaluate the impacts. “Are kids sleeping more? Are they out in the footy fields? Are they interacting more?”
“Delaying children’s access to social media accounts gives them valuable time to learn and grow, free of the powerful, unseen forces of harmful and deceptive design features such as opaque algorithms and endless scroll.”
This seems more like wishful thinking than a likely outcome. As Inman Grant says, time will tell. Yet the law is already beginning to show holes. There are the usual issues regarding circumvention via VPNs, which Wells brushes off with a trite, “kids will be kids.” But a bigger concern is Roblox, which eSafety has assessed does not qualify, despite widespread concerns about grooming, bullying and sexual content on the site.
Roblox not covered – even though it might be most dangerous
Inman Grant has defended the decision to exclude Roblox from the legislation, saying that just because this particular law doesn’t apply, it doesn’t give Roblox carte blanche to host sex predators. “We’re using other tools in our arsenal to keep these other platforms safer,” she says, noting that “just because a platform is exempted through the legislative rules doesn’t mean it’s safe.” She points to Roblox’s commitment to implement age assurance technology as evidence that, even if it doesn’t have to sweat the December 10 enforcement date, it is working to bring its service into alignment with the country’s codes and standards.
The overarching message is that the rules are here, and everybody had better be on their toes in terms of compliance. The government has released a variety of resources to support sites in understanding the law, and Inman Grant has urged platforms to be stringent in their self-assessments. She says she is done with arguments from Big Tech that effective age assurance is beyond their technical capabilities – arguments that tend to change conveniently when regulatory pressure gets too heavy.
“We’ve had deep conversations with companies about this,” she says. “I was in Silicon Valley talking to all of the major platforms. They’re already doing this. They know they can do this. All of a sudden, it’s ‘oh, we can target using our age inference technologies we’ve already been using for a decade.’”
“In the end, when we are confident that the companies have indeed complied, we’ll probably see some great PR about all the innovations they were able to achieve to this end.”
Not everyone, however, will find the bright side of compliance; some will simply lie. Witness OpenAI, which, when Inman Grant met with the company, conveniently neglected to mention that it would release Sora, its generative AI video app, the following week – despite branding it as a social app. The commissioner has followed up by sending OpenAI a self-assessment tool.
“Obviously with social media, we’re trying to remediate the harms that were created more than 20 years ago,” she says, underlining the amount of time that social media overlords have been allowed to run amok without regulation. “We also have to keep a watchful gaze or an eye on the future and look at how all of these technologies are converging. Because I happen to believe that if the proper guardrails aren’t put around AI, they could bring much greater harms.”