Roblox dials up safety measures with Persona age estimation, AI voice screening

Roblox is rolling out biometric age estimation, provided by Persona, for all users who want to access its communication features. An announcement from the social gaming platform, which has been flagged as a favorite haunt for groomers, says its solution will combine facial age estimation technology, ID age verification and verified parental consent to “provide a more accurate measure of a user’s age than simply relying on what someone types in when they create an account.”
“With this information, we’ll also launch new systems designed to limit communication between adults and minors unless they know each other in the real world. These added layers of protection will help provide users with access to developmentally appropriate features and content.”
Roblox is also partnering with the International Age Rating Coalition (IARC), which will replace Roblox’s own content ratings for games and apps on the platform with jurisdictional ratings systems worldwide. TechCrunch explains: “players in the Republic of Korea will see ratings from GRAC; players in Germany will see ratings from the USK; and players elsewhere in Europe and the United Kingdom will see ratings from the PEGI, for instance.”
The new age assurance system will be in place by the end of 2025. Roblox says it hopes to set a standard that other gaming, social media and communication platforms follow. “We expect that our approach to communication safety will become best practice for other online platforms, whether lawmakers pass laws requiring age verification for all platforms in the future or not.”
The company’s insistent tone, and the wave of other safety measures it has rolled out in recent months, are likely driven in part by legal trouble. Just this week, a mother in Oklahoma sued the company for failing to protect her 12-year-old daughter from sextortion, and a Michigan attorney filed suit on behalf of an adult who claims Roblox enabled a predator to groom, assault and blackmail her as a child. These follow a lawsuit from the State of Louisiana, claiming that “Roblox is overrun with harmful content and child predators because it prioritizes user growth, revenue, and profits over child safety.”
In a recent vlog, David Baszucki, founder and CEO of Roblox, offers an update on some of the measures Roblox has adopted in order to protect its roughly 80 million daily active users – and its reputation. The company says it has shipped over 100 safety initiatives since January 2025, including its Trusted Connections feature, which requires users who are 13 and over to complete facial age estimation to “have more authentic conversations” with connections that they know in real life. (The video features a fascinating breakdown of how, when and why it’s OK for kids to call each other a butthead.)
It has also implemented Roblox Sentinel, an open-source AI system that helps detect early signals of child endangerment. Per Matt Kaufman, Roblox’s chief safety officer, “the model is looking for long-term behavior patterns which result in violations of our policies.”
In addition, it has improved its open source voice filters with a model that directly analyzes voice communication, rather than text generated from speech, to capture intonation and other audio factors. And it has rolled out “new technology designed to detect servers where a large number of users are breaking our rules in experiences that are otherwise innocuous, and take them down.”
“All these initiatives build on the layered safety systems that already exist on Roblox,” says the blog. “Unlike many other online platforms, Roblox proactively monitors all text chat on the platform, prevents user-to-user image sharing, and has default settings designed to prevent users younger than 13 from using private chat or voice chat. We also filter public chat to block inappropriate content. Roblox provides parental controls so families can customize default settings to what they feel is best for their child.”
Roblox knows there are bad actors and questionable operators on its platform. Baszucki’s panel touches on the issues of aggressive vigilantism, deepfakes, and users looking to move conversations off-platform to sites with fewer security measures.
In implementing facial age estimation as the anchor for its online safety mechanisms, Roblox is aiming not just to address these issues but also to leave its legal woes behind in a bid for leadership in the age check sector.
Article Topics
age verification | children | face biometrics | facial age estimation (FAE) | gaming | ID verification | Persona | Roblox | social media
