
Roblox Unveils Age-Detecting AI From Selfie Video

By Charlotte | Jan 08, 2026

Roblox has unveiled a new suite of safety features designed for teenagers aged 13 to 17. Among them is a new age estimation technology that uses AI to gauge a user's age from a submitted video selfie.

The announcement details several new measures aimed at enhancing safety for teens and children on the platform. Central to the update are features that grant teens (13-17) more autonomy than younger users while keeping their accounts more restricted than adults'. A key feature lets teens designate "trusted connections" with whom they can communicate on Roblox without chat filters. According to Roblox, the goal is to keep those conversations within the monitored platform environment rather than pushing teens toward less secure third-party apps where inappropriate discussions could occur.

Trusted connections are meant for users who know each other in real life. If a teen wishes to add someone 18 or older as a trusted connection, they must do so using a QR code scanner or a contact importer for verification.
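
Roblox did not describe the mechanics of that handshake, but the general shape of a QR-based pairing flow is straightforward. The sketch below is a minimal illustration under stated assumptions: `PENDING_TOKENS`, `issue_connection_qr`, `redeem_scanned_token`, and `add_trusted_connection` are all hypothetical names, and the third-party `qrcode` library merely stands in for whatever Roblox uses to render codes.

```python
import secrets
import time

import qrcode  # third-party library: pip install qrcode

# Hypothetical in-memory token store; a real system would use a
# server-side database keyed to the adult user's verified account.
PENDING_TOKENS: dict[str, dict] = {}

def issue_connection_qr(adult_user_id: str, ttl_seconds: int = 300) -> str:
    """The 18+ user generates a short-lived, one-time token and renders
    it as a QR code image for the teen to scan in person."""
    token = secrets.token_urlsafe(16)
    PENDING_TOKENS[token] = {
        "adult_user_id": adult_user_id,
        "expires_at": time.time() + ttl_seconds,
    }
    path = f"trusted_connection_{adult_user_id}.png"
    qrcode.make(token).save(path)  # qrcode.make() returns a PIL image
    return path

def redeem_scanned_token(teen_user_id: str, token: str) -> bool:
    """The teen's app scans the QR code and redeems the embedded token.
    Unknown, reused, or expired tokens are rejected."""
    entry = PENDING_TOKENS.pop(token, None)
    if entry is None or time.time() > entry["expires_at"]:
        return False
    add_trusted_connection(teen_user_id, entry["adult_user_id"])
    return True

def add_trusted_connection(teen_user_id: str, adult_user_id: str) -> None:
    # Hypothetical persistence step standing in for the actual backend.
    print(f"{teen_user_id} and {adult_user_id} are now trusted connections")
```

The in-person scan is what does the verifying here: a short-lived, one-time token can only move from one phone to another if the two people are physically together, which serves as a proxy for knowing each other in real life.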

Previously, Roblox required a government ID to verify that a user was 13+ or 18+ before unlocking certain chat features. The company is now introducing an alternative verification method: users can submit a "video selfie," which an AI model trained on a large, diverse dataset analyzes to estimate whether the person is over 13. Google tested a similar AI-powered age estimation feature earlier this year, and Meta did the same the year before.
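
Roblox has not published how its estimator works beyond the training-data description above, so the sketch below is only a generic illustration of a video-based age check: detect a face in each frame, run an age model on the crop, and threshold the averaged estimate. The OpenCV calls are real APIs; `estimate_age` is a hypothetical stand-in for a trained model, and the frame-averaging logic is an assumption.

```python
import cv2  # pip install opencv-python
import numpy as np

# OpenCV's bundled Haar cascade face detector (shipped with the library).
FACE_DETECTOR = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def estimate_age(face: np.ndarray) -> float:
    """Hypothetical stand-in for a pretrained age-regression model.
    In practice this would be a neural network trained on a large,
    diverse set of labeled face images."""
    return 21.0  # dummy value so the sketch runs end to end

def passes_age_check(video_path: str, threshold: float = 13.0) -> bool:
    """Read the selfie video frame by frame, estimate an age for the
    first detected face in each frame, and compare the average of all
    per-frame estimates against the threshold (e.g. 13)."""
    capture = cv2.VideoCapture(video_path)
    ages: list[float] = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = FACE_DETECTOR.detectMultiScale(
            gray, scaleFactor=1.1, minNeighbors=5
        )
        for (x, y, w, h) in faces[:1]:  # use only the first detected face
            ages.append(estimate_age(frame[y:y + h, x:x + w]))
    capture.release()
    return bool(ages) and float(np.mean(ages)) >= threshold
```

One plausible reason to require a video rather than a still image is exactly that averaging: many frames smooth out per-frame noise, and frame-to-frame motion makes the check harder to fool with a printed photo.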

Alongside these updates, Roblox is rolling out additional tools such as online status controls, a "Do Not Disturb" mode, and enhanced parental controls for accounts linked to a teenager's profile.

Roblox has faced ongoing scrutiny over its approach to child safety. In 2018, reports surfaced of a seven-year-old's avatar being sexually assaulted in-game and a six-year-old being invited to a "sex room." A 2021 investigation by People Make Games accused Roblox's business model of exploiting child labor. The company faced a lawsuit in San Francisco in 2022 for allegedly enabling the financial and sexual exploitation of a 10-year-old girl. Further lawsuits in 2023 alleged the platform facilitated an illegal gambling ecosystem and had lax safety protocols that exposed children to adult content. Last year, a Bloomberg report highlighted the prevalence of child predators on the platform. That same year, Roblox stated it reported over 13,000 cases of child exploitation to the National Center for Missing and Exploited Children in 2023, leading to the arrest of 24 individuals suspected of targeting children through the game.

"Safety has always been foundational to everything we do at Roblox," stated Chief Safety Officer Matt Kaufman in the announcement. "Our goal is to lead the world in safety and civility for online gaming. We are dedicated to supporting experiences that are both deeply engaging and empowering for players of all ages, while continuously innovating how users connect and interact."