Two months in, Snapchat is still not a fan of Australia’s social media law

As Australia settles further into its era of age restrictions on large social media platforms, various stakeholders are offering thoughts on how the law has aged so far.
The latest is Snapchat, which has published a blog noting that, while the company remains committed to complying with the Social Media Minimum Age (SMMA) law, the past two months have provided it with “important insights about the potential limitations of this law as it currently stands.”
Snapchat says that, as of the end of January 2026, it has locked or disabled over 415,000 Snapchat accounts in Australia “belonging to users who either declared an age under 16 or who we believe to be under 16 based on our age detection technology. We continue to lock more accounts daily.”
And yet. Being a massive social media platform of exactly the sort Australia’s law is tailored to regulate, Snapchat is not convinced the new rules are working.
Its arguments, however, fall into two by-now easily recognizable fallacious troughs.
The first is that facial age estimation (FAE) – increasingly the most popular option for low-stakes age assurance – is only accurate to within 2-3 years on average. “In practice, this means some young people under 16 may be able to bypass protections, potentially leaving them with reduced safeguards, while others over 16 may incorrectly lose access.”
FAE providers have been adamant in explaining the need for biometric estimation systems to factor in a buffer zone, to account for exactly this discrepancy. Framing it as a disqualifying flaw misses the semantic mark on “estimation.”
The second is the fearmongering notion that, cut off from the safe haven of Snapchat, kids might turn to an alternative, unregulated and surely sinister massive social media network, where they would be subject to all manner of risks. This is an argument borrowed from the adult content industry, which has seen traffic to compliant sites collapse as users flock to smaller, unregulated porn sites. But for social media titans it is, proverbially, hogswallop, in that social media sites’ appeal (and risk) lies in mass adoption – a demonstrably difficult thing to sustain for new contenders in the market.
“While we don’t yet have data to quantify this shift,” Snapchat says, more or less admitting it’s pure speculation, “it’s a risk that deserves serious consideration as policymakers evaluate whether the law is achieving its intended outcomes.”
On the count of three, everyone point to the app stores
Also deserving of consideration, of course, is putting responsibility for age checks on another company. Like Meta, Snap says it is so committed to keeping young kids safe that it has thought of the best possible solution: “app store-level age verification as an additional safeguard to bolster the SMMA’s implementation in a way that is less likely to have negative unintended consequences.”
Making Google and Apple do age checks, says Snap, would “strengthen safety across the entire digital ecosystem,” and create “a more universal foundation for age assurance.
“Rather than blanket age-based social media bans, app store-level age assurance could help the entire ecosystem protect young users more consistently and deliver developmentally appropriate experiences while allowing them to enjoy the benefits of social media.”
Snap has argued throughout the Australian legislative process that it shouldn’t be affected by the prohibition, because it is primarily a messaging app rather than a content stream like Instagram or X. Its position has not changed.
“We want to be clear,” it says. “We fundamentally disagree that Snapchat is an in-scope age-restricted social media platform.”
Meanwhile, in Australia, Snap is employing k-ID to facilitate age assurance via ID document validation or biometrics, and ConnectID for bank network-based age checks.
The present strategy for social platforms, then, is to proclaim loudly that they are obeying the law while also loudly complaining about it, and trying to shift the responsibility and liability for age checks off the platform level.
“Despite our disagreement with the policy itself, we believe it’s important to engage constructively and suggest ways to improve its implementation and reduce negative unintended effects,” Snap says. “Creating a centralized verification system at the app-store level would allow for more consistent protection and higher barriers to circumventing the law.”
“In the meantime, we continue building safety protections that will keep young Snapchatters safe in Australia and around the world.”
Beware the temptations of Lemonade, RedNote and Yope
Canada is also exploring the idea of imposing age regulations on social media. The CBC’s daily morning news show has coverage of Australia’s legislation, looking at how the model might apply in a local context.
In an interview, Lisa Given, professor of information sciences at the Royal Melbourne Institute of Technology (RMIT), gives Australia’s record a middling grade, saying it’s been a mixed bag in terms of success.
To her credit, in suggesting that kids are migrating from major social platforms to other, less widely known sites, she names names, citing Lemonade, RedNote (AKA Xiaohongshu) and Yope as emerging contenders – “things many people may not have heard of.”
“It means the government has to be continually chasing new technologies, which is almost an unending fiction. It’s a game of Whac-a-Mole.”
In the end, however, it may not be the obscure social alternatives that pose the biggest threat, but established spaces not encompassed in the scope of the law – platforms like “4chan, or Truth Social, where we know that there’s content that’s really inappropriate for young people.”
For Canada, Given says the takeaway is “we really need something that will hold tech companies to account. And I think the general public really has to put that in front of legislators.”
Canada reportedly plans to reintroduce its moribund Online Harms Bill in the coming months, and it is expected to include measures affecting social media. However, as is noted by Taylor Owen, founding director of the Centre for Media, Technology and Democracy at McGill University, “in Canada, we don’t actually have a regulatory body or an enforcement body that could mandate companies to do it.”
Article Topics
age verification | app store age verification | Australia age verification | biometric age estimation | connectID | facial age estimation (FAE) | k-ID | Snapchat | social media