Published Date: 9/4/2025
Roblox is rolling out biometric age estimation, provided by Persona, for all users who want to access its communication features. An announcement from the social gaming platform, which has been flagged as a favorite haunt for groomers, states that its solution will combine facial age estimation technology, ID age verification, and verified parental consent to “provide a more accurate measure of a user’s age than simply relying on what someone types in when they create an account.”
With this age information, Roblox will also launch new systems designed to limit communication between adults and minors unless they know each other in the real world. These added layers of protection are intended to give users access to developmentally appropriate features and content.
Roblox is also partnering with the International Age Rating Coalition (IARC), which will replace Roblox’s own content ratings for games and apps on the platform with jurisdictional ratings systems worldwide. For example, players in the Republic of Korea will see ratings from GRAC; players in Germany will see ratings from the USK; and players elsewhere in Europe and the United Kingdom will see ratings from PEGI. The new age assurance system will be in place by the end of 2025.
The company hopes to set a standard that other gaming, social media, and communication platforms follow. “We expect that our approach to communication safety will become best practice for other online platforms, whether lawmakers pass laws requiring age verification for all platforms in the future or not.”
The company’s insistent tone and the wave of other safety measures it has rolled out in recent months are likely driven in part by legal trouble. Just this week, a mother in Oklahoma sued the company for failing to protect her 12-year-old daughter from sextortion, and a Michigan attorney filed suit on behalf of an adult who claims Roblox enabled a predator to groom, assault, and blackmail her as a child. These follow a lawsuit from the State of Louisiana claiming that “Roblox is overrun with harmful content and child predators because it prioritizes user growth, revenue, and profits over child safety.”
In a recent vlog, David Baszucki, founder and CEO of Roblox, offers an update on some of the measures Roblox has adopted to protect its roughly 80 million daily active users and its reputation. The company says it has shipped over 100 safety initiatives since January 2025, including its Trusted Connections feature, which requires users who are 13 and over to complete facial age estimation to “have more authentic conversations” with connections they know in real life. (The video features a fascinating breakdown of how, when, and why it’s OK for kids to call each other a butthead.)
It has also implemented Roblox Sentinel, an open-source AI system that helps detect early signals of child endangerment. According to Matt Kaufman, Roblox’s chief safety officer, “the model is looking for long-term behavior patterns which result in violations of our policies.”
In addition, it has improved its open-source voice filters with a model that directly analyzes voice communication, rather than text generated from speech, to capture intonation and other audio factors. And it has rolled out new technology designed to detect servers where a large number of users are breaking its rules in experiences that are otherwise innocuous, and take them down.
“All these initiatives build on the layered safety systems that already exist on Roblox,” says the blog. “Unlike many other online platforms, Roblox proactively monitors all text chat on the platform, prevents user-to-user image sharing, and has default settings designed to prevent users younger than 13 from using private chat or voice chat. We also filter public chat to block inappropriate content. Roblox provides parental controls so families can customize default settings to what they feel is best for their child.”
Roblox knows there are bad actors and questionable operators on its platform. Baszucki’s panel touches on the issues of aggressive vigilantism, deepfakes, and users looking to move conversations off-platform to sites with fewer security measures.
In implementing facial age estimation as the anchor for its online safety mechanisms, Roblox aims not just to address these issues but to leave its legal woes behind and take a leadership role in the age check sector.
Q: What new safety measures is Roblox implementing?
A: Roblox is implementing biometric age estimation, AI voice screening, and partnering with IARC for jurisdictional content ratings to enhance user safety, especially for minors.
Q: How does the facial age estimation work?
A: Facial age estimation uses technology from Persona to provide a more accurate measure of a user’s age by analyzing facial features, combined with ID age verification and verified parental consent.
Q: What is the Trusted Connections feature?
A: The Trusted Connections feature requires users who are 13 and over to complete facial age estimation to ensure they are having authentic conversations with connections they know in real life.
Q: How does Roblox Sentinel help in child safety?
A: Roblox Sentinel is an open-source AI system that detects early signals of child endangerment by monitoring long-term behavior patterns that result in violations of Roblox’s policies.
Q: What parental controls does Roblox offer?
A: Roblox provides parental controls that allow families to customize default settings to what they feel is best for their child, including restrictions on private chat and voice chat for users under 13.