Roblox’s new age-check requirement
Roblox, one of the world's most popular gaming platforms for children, is taking a controversial new step to combat online predators in its chat rooms.
If Roblox users want to chat on the online gaming platform, they’ll soon have to verify their age by providing a government ID or by letting an artificial intelligence age estimation tool photograph their face.
Roblox must also account for bad actors who try to use fake photos or pictures of another person to evade accurate age estimation. Matt Kaufman, Roblox's chief safety officer, said the combination of video and still imagery helps prevent that. Additionally, the company will deploy fraud checks.
The move comes amid multiple lawsuits against Roblox over child safety concerns, including suits from the attorneys general of Texas and Louisiana. The lawsuits were filed following reports that Roblox was exposing young users to dangerous risks.
Facing pressure from states and families, Roblox will require players to use AI-powered technology that helps verify their age.
Roblox is getting ready to roll out its latest safety feature at a time when it is confronting a growing number of lawsuits from states and families that claim the popular gaming app falsely advertised itself as safe for children.
Roblox will now require age checks for communications between users, with facial age estimation (FAE) as a biometric option.