Roblox Launches AI Age Verification as Predator Lawsuits Mount
Roblox, one of the world’s most popular gaming platforms for children, is rolling out a sweeping new safety initiative requiring all users to verify their age through a government ID or an AI-powered facial age estimation tool. The move comes amid growing legal pressure and a wave of lawsuits accusing the company of enabling predators to contact and groom minors on the platform.
With more than 150 million global users, a third of them under 13, Roblox has long marketed itself as a safe creative sandbox for young coders and gamers. But recent investigations and legal actions have thrust the platform into the spotlight, highlighting troubling cases of grooming, abuse, and even kidnappings linked to adult predators who connected with children through Roblox.
Lawsuits Mount as Attorneys General Sound the Alarm
This year, attorneys general from Kentucky and Louisiana filed lawsuits accusing Roblox of harming children by failing to protect them from predators. Florida’s attorney general issued a criminal subpoena, publicly calling Roblox a “breeding ground for predators.”
Individual families have also come forward, including the mother of 15-year-old Ethan Dallas, who died by suicide after allegedly being groomed through Roblox and Discord. These legal challenges have intensified public scrutiny and fueled calls for stricter accountability.
Roblox’s New Safety Strategy: Scan Your Face or Upload ID
In response, Roblox is implementing what it calls a “new industry standard” for online safety. Under the updated policy:
- All users must verify their age to access chat.
- Users can submit a government ID or allow the AI tool—developed with verification firm Persona—to analyze their face using the device's camera.
- The system sorts players into age groups: Under 9, 9–12, 13–15, 16–17, 18–20, and 21+.
- Users will only be able to chat with others in their own or adjacent age ranges, significantly reducing contact between adults and minors.
For example, a 12-year-old can chat with players 15 and under, but not with anyone 16 or older.
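The bucket-and-adjacency rule described above can be sketched in a few lines of code. This is purely illustrative: Roblox has not published its implementation, and the function names and logic here are assumptions based only on the age ranges and the "own or adjacent group" rule stated in this article.

```python
# Illustrative sketch of the age-bucket chat rule (not Roblox's actual code).
# Groups, in order: Under 9, 9-12, 13-15, 16-17, 18-20, 21+.
AGE_GROUP_UPPER_BOUNDS = [8, 12, 15, 17, 20]


def age_group(age: int) -> int:
    """Return the index (0-5) of the age group containing `age`."""
    for index, upper in enumerate(AGE_GROUP_UPPER_BOUNDS):
        if age <= upper:
            return index
    return len(AGE_GROUP_UPPER_BOUNDS)  # 21+ group


def can_chat(age_a: int, age_b: int) -> bool:
    """Users may chat only within their own or an adjacent age group."""
    return abs(age_group(age_a) - age_group(age_b)) <= 1


# Matches the article's example: a 12-year-old (9-12 group) can reach
# the adjacent 13-15 group but not 16-17 or beyond.
print(can_chat(12, 15))  # True
print(can_chat(12, 16))  # False
```

Under this scheme an adult could still chat with older teens in an adjacent bracket (e.g. 18–20 with 16–17), which is consistent with the article's claim that the system reduces, rather than eliminates, adult-minor contact.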
Roblox says the images used for face verification are deleted immediately after processing and are not stored.
A Global Rollout and Industry Ripple Effects
The age-check system begins voluntarily this week before becoming mandatory in Australia, New Zealand, and the Netherlands in December, and globally by early next year. Roblox believes this approach will set a precedent for other platforms—especially as Meta, YouTube, and others also experiment with AI-driven age verification tools.
Roblox Chief Safety Officer Matt Kaufman said the company aims to create “a safe, positive, age-appropriate experience for everybody.”
But critics warn that AI face-scanning raises questions about privacy, accuracy, and accessibility—especially after reports that users on other platforms have bypassed facial age checks using images of video game characters or other people.
Roblox insists its technology includes “robust fraud checks” to ensure users are real, live, and following verification prompts.
As scrutiny grows and lawsuits pile up, Roblox’s new AI-driven safety overhaul represents one of the most aggressive attempts yet to protect minors on a major platform. Whether it becomes the new standard—or ignites further controversy—remains to be seen.

