Tuesday, November 25, 2025

Roblox blocks children from chatting to adult strangers

Roblox, the popular online gaming platform, is implementing new safety measures that use facial age estimation technology to block children from chatting with adult strangers. The move comes in response to growing concerns and lawsuits over child safety and grooming on the platform.

The new system requires users to undergo a facial age check using their device’s camera before accessing chat features. Images are processed externally and deleted immediately after estimation, which Roblox says protects privacy. Users are then categorized into age groups: under 9, 9 to 12, 13 to 15, 16 to 17, 18 to 20, and 21 and over. This stratification allows communication only within similar age brackets, unless users are added as “trusted connections” for contacts known in real life.
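The bucketing scheme described above can be sketched in a few lines of Python. The bucket boundaries come from the article; the function names, the adjacent-bucket rule for what counts as a “similar” bracket, and the trusted-connection override are illustrative assumptions, not Roblox’s actual implementation.

```python
# Hypothetical sketch of age-bucket chat gating.
# Bucket boundaries are from the article; everything else
# (names, adjacency rule, trusted override) is assumed.

AGE_BUCKETS = [
    (0, 8),    # under 9
    (9, 12),
    (13, 15),
    (16, 17),
    (18, 20),
    (21, 200), # 21 and over
]

def bucket_for(age: int) -> int:
    """Return the index of the age bucket containing `age`."""
    for i, (lo, hi) in enumerate(AGE_BUCKETS):
        if lo <= age <= hi:
            return i
    raise ValueError(f"age out of range: {age}")

def can_chat(age_a: int, age_b: int, trusted: bool = False) -> bool:
    """Allow chat within the same or an adjacent bucket,
    or between any pair marked as a trusted connection."""
    if trusted:
        return True
    return abs(bucket_for(age_a) - bucket_for(age_b)) <= 1
```

Under these assumptions, a 10-year-old could chat with a 14-year-old (adjacent buckets) but not with a 35-year-old, unless the two were trusted connections.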

Rollout begins in early December 2025 in Australia, New Zealand, and the Netherlands, with a global expansion scheduled for January 2026. Roblox claims this makes it the first major gaming platform to enforce such age checks for communication, setting a precedent in the industry. The phased approach allows for testing and adjustments before worldwide implementation.

This decision follows a series of lawsuits in the United States, including cases in Texas, Kentucky, Louisiana, and Nevada, alleging that Roblox’s previous systems enabled predators to groom children. For instance, a lawsuit describes how a 13-year-old girl was manipulated into sharing explicit content after being contacted by an adult posing as a child. These incidents have highlighted vulnerabilities in the platform’s safety measures.

In response, child safety advocates have welcomed the changes but urge rigorous enforcement. Rani Govender of the NSPCC emphasized that while the steps are positive, Roblox must ensure practical protections to prevent adult perpetrators from targeting young users. Similarly, Beeban Kidron of the 5Rights Foundation called for gaming companies to prioritize children’s safety in their services.

Roblox’s chief safety officer, Matt Kaufman, stated that the age estimation technology is accurate within one to two years for users aged five to 25, and expressed hope that other platforms would adopt similar measures. The company has also highlighted its existing safeguards, such as prohibiting image sharing and restricting external links, but acknowledges that no system is perfect.

The broader context includes the UK’s Online Safety Act, which mandates tech firms to protect children from online harms, with Ofcom overseeing compliance. Anna Lucas of Ofcom praised Roblox’s efforts, noting that platforms are now taking necessary steps, though more work remains. This shift reflects a growing industry trend towards enhanced digital safety for minors.

As Roblox moves forward, parents will retain control over their children’s accounts, including the ability to update age information post-verification. The platform, which averaged over 80 million daily players in 2024, with about 40% under 13, aims to balance user engagement with safety, potentially influencing other online services to follow suit.
