Roblox is rolling out a broad overhaul of its child safety settings, introducing new age-based account types as the platform faces pressure from Australian regulators and renewed concern from parents over the reliability of its age-check system.
The gaming platform said it will launch Roblox Kids and Roblox Select globally from late May, using facial age-estimation technology to determine what users can access, who they can talk to, and what kinds of games they can play.
What is changing on Roblox?
The update expands Roblox’s existing age assurance system beyond chat functions and into core account access. Users estimated to be under nine will be placed into Roblox Kids, which limits access to curated games and turns communication tools off by default.
Users aged nine to 15 will be placed into Roblox Select, which allows a broader but still restricted content library, along with age-matched communication features. Users who do not complete an age check will be limited to children’s content and will not be able to use communication features.
Roblox also plans to adopt Australian Classification Board age ratings later this year, adding another layer of content controls for younger users.
Australian scrutiny has intensified
The changes come after months of scrutiny in Australia, where eSafety Commissioner Julie Inman Grant placed Roblox on notice in February following reports of child grooming and sexual exploitation on the platform.
Australia’s eSafety office opened an investigation into Roblox’s safety practices in July 2025 and said it would continue to assess the company’s compliance with the Online Safety Act and newly commenced age-restricted material codes.
Roblox was not included in Australia’s under-16 social media ban, a decision critics have described as a loophole given the large number of children and teenagers who use the platform.
Parents raise concerns over age-check errors
While Roblox says the system is safer than relying on self-reported birthdays, some parents have raised concerns that facial age estimation can misclassify children and place them into less restricted versions of the service.
Matt Kaufman, chief safety officer at Roblox, said the company’s technology estimates the age of users under 18 to within roughly 1.4 years in either direction. He argued that asking users to enter their age is less reliable, because some children will adjust their details to unlock extra features.
Kaufman also said Roblox has seen cases where parents helped children bypass checks. According to the company, it monitors account behaviour for signs that a user may be younger than initially estimated and prompts some users to complete the process again.
Roblox said parents can appeal decisions, reset age checks, or use ID verification to correct errors. It also plans to give parents more controls to block games and manage direct messages until a child turns 16.
How Roblox will decide what younger users can see
To determine which games are suitable for younger users, Roblox said it will use a mix of moderation signals, developer history, platform behaviour, and content screening. Games with social hangouts, free-form drawing, or sensitive themes will not be available by default in the new child and teen account tiers.
Developers who want their experiences included for younger users must also complete extra checks, including ID verification and two-factor authentication. New games will first be tested by verified users aged over 16 before they are made available more broadly.
Roblox says it has more than two million developers building experiences on the platform, which adds to the complexity of moderating content at scale.
Commercial pressure meets platform safety
The company has framed the changes as part of a longer-term safety strategy rather than a direct response to government threats. Kaufman said Roblox views trust and safety investment as central to its growth, arguing that stronger protections help build confidence among users and families.
That argument is now being tested in public. Roblox remains under investigation in Australia, is facing criticism from child safety advocates, and is also defending itself overseas as lawmakers and parents push technology platforms to do more to protect younger users online.
