Australian regulators come together on privacy, online safety

OAIC, eSafety sign memorandum to work together on risks from AI, gaming

The relationship between various regulatory bodies across the privacy and online safety spectrum can be difficult to parse. Australia’s two major digital regulators, eSafety and the Office of the Australian Information Commissioner (OAIC), are simplifying things by signing a memorandum of understanding (MoU) on working together to protect privacy and safety online.

The MoU aims to “guide and facilitate the parties’ collaboration, cooperation and mutual assistance in the performance of their respective statutory functions, and provide transparency about the parties’ efforts to coordinate activities and minimize duplication.” Under the terms, the parties will designate liaison contact officers to facilitate communication and exchange of information.

Generally, the document is a promise to work together in harmony on issues pertaining to the Privacy Act, the Online Safety Act, and the topics they address – including biometric data collection and age assurance requirements under the Social Media Minimum Age obligation.

“Both regulators have always recognized that combating certain harms requires privacy and safety to go hand in hand,” says eSafety Commissioner Julie Inman Grant. “For example, at eSafety we knew from the outset our implementation of the Social Media Minimum Age would need to recognize important rights, including the right to privacy. Our commitment to continue working collaboratively with the OAIC gives formal recognition to that principle and sets out how we will balance and promote privacy and safety for everyone.”

Inman Grant says the collaboration is timely, given new risks emerging with large language models (LLMs) and other AI technologies.

Australian Information Commissioner Elizabeth Tydd says that, with the MoU, “we’re not only formalizing cooperation, but building a foundation where privacy protections and online safety initiatives can better address specific harms side by side, ensuring Australians can be protected when interacting online.”

Four gaming platforms get transparency notices from eSafety

High on the list of issues for the newly paired agencies is the problem of grooming, sexual exploitation and radicalization on online gaming platforms. A release from eSafety says it has handed “legally enforceable transparency notices” to Roblox, Minecraft, Fortnite and Steam, “amid concerns online games are being used by sexual predators to groom children and by extremist groups to spread violent propaganda and radicalize young people.”

Most Australian kids use one or more of these platforms. According to research by eSafety, around 9 in 10 children aged 8 to 17 in Australia play or have played online games. As such, the commissioner wants to know what these platforms are doing to identify and prevent harms, and asks how their systems, staffing and design choices are aligned with the Australian Government’s Basic Online Safety Expectations.

“Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialize and communicate,” says Inman Grant. “Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalization and other off-platform harms.”

Because these platforms allow users to craft and share their own games, content can be created to normalize atrocities: for instance, gamifying the operation of a concentration camp, or the January 6 Capitol Building riot in the U.S.

“We’ve seen numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay. This includes Islamic State-inspired games and recreations of mass shootings on Roblox, as well as far right groups recreating fascist imagery in Minecraft.”

“These companies must take meaningful steps to prevent their services becoming onramps to abuse, extremist violence, radicalization or lifelong harm.” Per the release, a breach of a direction to comply with a code or standard can result in penalties of up to $49.5 million (roughly US$35.5 million) per breach, and failing to respond to a transparency reporting notice can lead to penalties of up to $825,000 (about US$590,000) a day.

Of the four platforms in eSafety’s sights, Roblox has gotten the worst press and the most legal scrutiny. This week, it agreed to pay a combined US$35.8 million to settle child online safety cases with the attorneys general of Nevada, Alabama and West Virginia.

It also has Australia’s attention. Under the Online Safety Codes and Standards, Roblox “committed to make a number of key changes earlier this year to protect children including more stringent age assurance, making accounts belonging to under 16s private by default, and introducing tools to prevent adult users from contacting under 16s without parental consent.” Testing on the implementation of these commitments will “validate their effectiveness.”

Canadian government worried Roblox is radicalizing youth 

Roblox recently launched new age-tiered accounts, and has regularly pledged to be a leader on online safety. Despite its efforts, concerns continue to rise over how adults are using gaming sites to lure children. The Logic has a report on a Public Safety Canada brief obtained through a freedom of information request, which singles out Roblox for being “of particular relevance as an entry point where vulnerable children and youth are targeted by malicious actors.”

Its unique combination of social interaction, user-generated content and a young user base means “Roblox may impact youth radicalization in unexpected ways.”

Canada is considering a social media age restriction and attendant age verification rules similar to Australia’s. Culture Minister Marc Miller, who is expected to table online safety legislation this year, says “the gaming industry is different than other platforms, and the more that they become sort of social media-ish, the more they expose themselves to responsibility and potentially regulation.”
