Roblox to make age assurance for chat mandatory as of January 2026

Roblox is trying. In the face of mounting criticism of the social gaming platform’s failure to protect kids from grooming and other harms, Roblox has announced that it will now require age checks for communications between users, with facial age estimation (FAE) as a biometric option.
“Roblox is the first online gaming or communication platform to require facial age checks to access chat, establishing what we believe will become a new industry standard,” says a post on Roblox’s blog. “This innovation supports age-based chat and limits communication between minors and adults. Once the age check is complete, users will only be allowed to chat with others in similar age groups, unless they become Trusted Connections with people they know.”
“Age checks are completely optional; however, features like chat will not be accessible unless the age check is complete.”
Roblox’s communications have boasted of its leadership in the age assurance space, partly as a pushback to a wave of lawsuits accusing it of creating (in language taken from a suit launched by the state of Louisiana) “an environment where sexual predators thrive, unite, hunt and victimize kids.” Kentucky and Texas have also sued on similar grounds. Ken Paxton, Texas’ attorney general, calls Roblox a “digital playground for predators where the well-being of our kids is sacrificed on the altar of corporate greed.”
The platform has been using Persona for its existing biometric age verification protocols since 2023, when the idea was to separate adult users from kids. In July 2025, it rolled out facial age estimation (FAE), jointly provided by Persona and Paravision, as part of its Trusted Connections feature, which enables interactions between vetted accounts. In September, it expanded the program and added Roblox Sentinel, an open-source AI system that helps detect early signals of child endangerment, and improved its open-source voice filters to capture intonation and other details.
Roblox is phasing the rollout of its age assurance system, starting with a voluntary age check period that has already begun. It will become mandatory in select markets in December, including Australia (where the much-ballyhooed social media ban for under-16s goes into effect on December 10), New Zealand and the Netherlands. A global rollout will begin in January 2026.
The company says the FAE system is designed for minimal data retention, deleting images and video immediately after processing. For maximum safety, chat in experiences will default to off for users under nine years old, unless a parent provides consent after an age check. Chat outside of experiences (in the platform’s mall, if you will) is restricted for users under 13.
“After users complete the age check process, we will inform them of their assigned age group: Under 9, 9-12, 13-15, 16-17, 18-20, or 21+. Users will be able to chat with those in their own age group and similar age groups, as appropriate,” Roblox says. So if Sofia is estimated to be 12, no one over 15 will be allowed to interact with her account, unless they are a Trusted Connection.
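The age-band gating described above can be sketched in a few lines. Note this is an illustrative reconstruction, not Roblox’s actual implementation: the band boundaries come from the article, but the exact meaning of “similar age groups” (modeled here as adjacent bands, which matches the Sofia example) and the Trusted Connections override are assumptions.

```python
# Hypothetical sketch of age-band chat gating as described in the article.
# Band boundaries are from Roblox's announcement; the adjacent-band rule
# and trusted-connection override are assumptions for illustration.

AGE_BANDS = ["Under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def band_index(age: int) -> int:
    """Map an estimated age to the index of its age band."""
    if age < 9:
        return 0
    if age <= 12:
        return 1
    if age <= 15:
        return 2
    if age <= 17:
        return 3
    if age <= 20:
        return 4
    return 5

def can_chat(age_a: int, age_b: int, trusted: bool = False) -> bool:
    """Allow chat within the same or an adjacent band, or between Trusted Connections."""
    if trusted:
        return True
    return abs(band_index(age_a) - band_index(age_b)) <= 1

# Sofia (estimated 12) can chat with a 15-year-old in the adjacent band...
print(can_chat(12, 15))  # True
# ...but not with a 16-year-old, unless they are Trusted Connections.
print(can_chat(12, 16))  # False
print(can_chat(12, 16, trusted=True))  # True
```

Under this reading, “similar age groups” means at most one band apart, which is why no one over 15 can reach a 12-year-old’s account without a Trusted Connection.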
In January, Roblox will also require age checks to access social media links on user profiles, communities and experience details pages.
Roblox continues to insist that its efforts to combat age-inappropriate interactions are a light in the digital darkness. The platform notes its algorithmic voice and text monitoring system, age-based chat filtering, strict media sharing policy and restrictions on links. It touts its parental controls: “for parents of teens, we offer tools for transparency, allowing them to easily view who their teen is connecting with.”
Can facial age estimation make Roblox once again safe for kids of all ages? Success will be measured in additional lawsuits.
Article Topics
age verification | biometric age estimation | children | facial age estimation (FAE) | Paravision | Persona | Roblox