Self-assess ahead of social media ban, says Australian eSafety Commissioner

Platforms like Lego Play, GitHub urged to consult guidelines as X pushes back

Australia’s eSafety Commissioner is urging large online platforms to figure out if they’re affected by the nation’s incoming age restrictions on social media, which ban users under 16 from having accounts.

While big names like Facebook, Instagram and X are clearly implicated, there has been less clarity on the status of sites that are not strictly for posting. YouTube was given an exemption – which was then revoked, prompting an angry response from the video sharing site. Gaming sites, dating sites and messaging apps could all find themselves subject to the law.

Julie Inman-Grant, the eSafety commissioner, says they shouldn’t wait to find out. Having issued guidance to help social media platforms meet their regulatory obligations, she has written to 16 platforms encouraging them to self-assess ahead of the ban coming into effect on December 10. According to News.com.au, the list goes far beyond what’s normally thought of as social media, including gaming platforms Steam, Roblox and Lego Play; streaming sites Twitch and Kick; messaging platforms Discord and HubApp; online dating conglomerate Match Group; and code repository GitHub.

More traditional social platforms include those owned by Meta (Facebook, Instagram and WhatsApp), as well as X, Snap, TikTok and Pinterest. The eSafety commissioner’s office says these “meet many of the conditions” for age restrictions.

The news report quotes a spokesperson for eSafety, declaring that “any platform eSafety believes to be age-restricted will be expected to comply and eSafety will make this clear to the relevant platforms in due course. eSafety will also provide information to the public about platform self-assessments and its own view in the lead up to December 10.”

Any exemptions will also be announced before the deadline.

Self-assessment may keep some platforms out of hot water – but the government’s recent Age Assurance Technology Trial (AATT) identified the risks in overzealous self-policing.

X petitions to delay ban, saying it isn’t ready

The hammer won’t fall on the tenth of December, if X has anything to say about it. Elon Musk’s Jokerized iteration of Twitter is seeking to delay the social media ban, according to an article from InnovationAus.

The platform says it was not given a chance to provide feedback on the “reasonable steps” platforms must take to block access to users under 16, complaining that it was given “mere weeks to interpret, plan, and deploy compliance measures under the threat of substantial penalties, exacerbating risks of incomplete implementation, higher costs, and potential inconsistencies across platforms.”

The idea that X didn’t see Australia’s ban coming assumes the company has missed the last five years or so of news and legislative activity. One might argue that Musk, who purchased and rebranded Twitter in 2022, has been busy working as a public servant, making big speeches and training his AI, Grok, to model itself on great orators of history.

One might also note that Musk dislikes regulation as a rule, and that he has fashioned X as a beacon of free speech, no matter how nasty it gets. Indeed, the latest news from Business Insider says Grok’s “spicy” mode has been generating child sexual abuse material (CSAM).

The company’s last-minute call to delay the ban by six months rests on supposed “serious concerns” that the ban may be unlawful and incompatible with Australia’s Privacy Act and international rights commitments. It also says that it has not had enough time to implement age assurance – a statement that flies in the face of the results of the AATT, which identified a number of market-ready tools for age verification and facial age estimation.

There are others who still insist the technology is not ready for primetime. The Australian Research Council Centre of Excellence for the Digital Child says age estimation algorithms are “not close to successfully distinguishing between people who are under and over 16 years old”.

“While it remains a common refrain in computer science that such systems simply require better training data, more sophisticated algorithms or other incremental improvement, research analysis has shown that age estimation solutions from facial scans demonstrate that automated processes such as these cannot ever be expected to achieve acceptable levels of accuracy.”

The industry’s recommendation is to work with a buffer age, typically of two or three years.
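In practice, the buffer-age approach means that facial age estimation alone clears only users whose estimated age sits comfortably above the threshold, while borderline estimates are escalated to a stronger check. The sketch below is a hypothetical illustration of that logic; the names, thresholds and decision labels are illustrative assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of the "buffer age" approach for facial age estimation.
# All names and thresholds are illustrative assumptions.

MINIMUM_AGE = 16   # Australia's under-16 social media threshold
BUFFER_YEARS = 3   # industry-recommended buffer of two or three years

def assess(estimated_age: float) -> str:
    """Decide what to do with a facial age estimate.

    Users estimated well above threshold + buffer pass on the estimate
    alone; users estimated below the threshold are blocked; anyone in
    the buffer zone is routed to a stronger age verification method
    (e.g. a document-based check).
    """
    if estimated_age >= MINIMUM_AGE + BUFFER_YEARS:
        return "allow"
    if estimated_age < MINIMUM_AGE:
        return "block"
    return "verify"  # inside the buffer: escalate to stronger verification

print(assess(21.4))  # allow
print(assess(17.0))  # verify
print(assess(14.2))  # block
```

The point of the buffer is that estimation error is absorbed by escalation rather than by letting borderline users through, which is why a wider buffer trades user friction for fewer false negatives.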

Advisory group formed to help steer, evaluate social media law

Objections are inevitable to any new law that imposes additional responsibilities on free market titans. Perhaps anticipating the need for wise counsel, the eSafety Commissioner has appointed an academic advisory group of eleven distinguished experts to support what a release calls “a robust and transparent evaluation of the implementation and outcomes of the Social Media Minimum Age (SMMA) obligation.”

The advisory group, led by Stanford University’s Social Media Lab, will support the eSafety Commissioner in examining how the SMMA obligation is being implemented and “evaluate its short and medium-term impacts on children, young people, and their parents or caregivers.” Its role is to provide independent, evidence-based guidance throughout the evaluation’s design, analysis, and implementation phases.

The council and its findings will serve as a “data source informing the future independent review of the legislation.”

The eSafety commissioner, Julie Inman-Grant, says that “collectively, the group brings unique and specialised experience in adolescent mental health, child and adolescent development, family dynamics in digital media use, and the lived experiences of children and parents in online environments. Their specialisations include children’s digital rights, online harms, and the digital lives of Aboriginal and Torres Strait Islander communities. The team also brings deep knowledge in psychiatric epidemiology, digital parenting, and the complex relationship between social media and wellbeing.”

“The inclusion of both domestic and international experts from a range of complementary disciplines ensures the evaluation is informed by a rich diversity of perspectives and enhances the relevance and global impact of its findings.”

Notably, no one appointed to the council is receiving remuneration for participating.

Social media companies must prove compliance to UK regulator next week

The UK is also approaching a key age assurance deadline. Tuesday, September 30 is the deadline for the world’s biggest social media platforms – including Facebook, Instagram, Roblox, TikTok and YouTube – to prove to regulator Ofcom they are in compliance with new online safety laws.

Andy Lulham, online safety expert and COO at UK age assurance provider Verifymy, calls Ofcom’s decision to focus on the platforms most popular with children “a sensible approach that will lead to widespread protection.”

“Age checks are the easiest way for social media platforms to know their users and create age-appropriate experiences that go beyond shielding children from adult or harmful content. Some social media companies have done a great job of creating tailored experiences for young people, but it’s all for nought if users can still misreport their age.”

Lulham also repeats what is becoming something of a mantra in age assurance circles: the technology is useful, but it’s not “a silver bullet.” “Platforms and regulators must continue to engage with educators, parents, young people and each other to ensure children are well-equipped to navigate the digital world safely.”
