Big platforms pout over Australia’s social media law but pledge to comply

Any parent knows that, however loudly a child protests the rules, the child will usually give in eventually. So it goes for giant social media platforms in the case of Australia’s incoming law prohibiting kids under 16 from creating social media accounts.
Meta, Snap and TikTok have all committed to complying with the law – although all say it will be hard to enforce.
France 24 quotes TikTok’s Australia policy lead Ella Woods-Joyce, who told a Senate hearing that “TikTok will comply with the law and meet our legislative obligations,” but noted that “experts believe a ban will push younger people into darker corners of the Internet where protections don’t exist.”
Meta, which owns Instagram and Facebook, likewise says it will do its best to remove all Australian users under 16 from its platforms – “the goal from our perspective being compliance with the law,” in the words of Meta policy director Mia Garlick. But, she says, it will be an “enormous challenge.”
InnovationAus quotes Garlick as saying the company will use a “waterfall approach” to age checks, with tiers of age verification depending on risk factors, but that the company is still working through the “precise machinations.”
Social media borrows argument from porn, but it doesn’t really apply
The platforms have shown they will argue against Australia’s law however they can. Snapchat and YouTube have agitated for exemptions, saying their services aren’t primarily “social media.” TikTok casts doubt on the pace of the process, saying it’s tough to comply with such “novel legislation” given the timelines – even though the law was approved almost a year ago and does not come into effect until December 10.
The idea that regulations will just force people into the shadowy netherworlds of the dark web is familiar from arguments the adult content industry deploys to push back against age verification laws for pornography. But it overstates the problem, particularly in the case of social media, which depends on a critical mass of users. Joining the platforms that have amassed the necessary numbers to appeal to the average person is not a simple task – just ask Mastodon, Vine or Yik Yak.
Besides which, the would-be realm of seedy, unregulated social media doesn’t really exist, at least not in the same way that child sexual abuse material or violent pornography exist. The whole point of legislating age checks for social media is that society has realized these specific, widely used social platforms, which have dominated our lives for 20 years, are in themselves causing harm.
There’s no Evilgram where kids will go to post selfies and find their social lives destroyed; that’s just Instagram. There’s no secret social network for aspiring mad geniuses with Nazi leanings; that’s X. Even concerns about grooming on dark web sites are undercut by how much of it happens not on iffy message boards, but on Roblox – one of the world’s biggest digital platforms.
To paraphrase a popular urban legend, there is no boogeyman out there in the dark; the posts are coming from inside the house.
Regulators know this. Australian eSafety Commissioner Julie Inman-Grant has warned all the big players that they are almost certain to be covered by the law. She has rejected YouTube’s call to reinstate a carve-out for video-sharing platforms. And she is likely to proceed with enforcement and fines for those who don’t pass muster after the December 10 deadline, as the world watches to see whether Australia can pull off its world-first attempt to keep kids off social media.