Texas sues Roblox for allegedly failing to protect children on its platform
TL;DR
Texas AG Ken Paxton is suing Roblox for allegedly failing to protect children from predators and making deceptive safety claims, citing grooming and exploitation cases. Roblox denies the allegations, pointing to its numerous safety measures and to industry-wide challenges.
Texas AG Ken Paxton is accusing Roblox of “putting pixel pedophiles and profits over the safety of Texas children,” alleging in a lawsuit filed this week that it is “flagrantly ignoring state and federal online safety laws while deceiving parents about the dangers of its platform.”
The lawsuit accuses Roblox of deceptive trade practices for misleading parents and users about its safety features, and for creating a common nuisance by harboring a space “that has become a habitual destination for child predators engaging in grooming and child sexual exploitation.”
The lawsuit’s examples focus on instances of children who were abused by predators they met via Roblox, and on the activities of groups like 764, which have used online platforms to identify victims and blackmail them into sexually explicit acts or self-harm. According to the suit, Roblox’s push for parental controls began only after a number of lawsuits and a report released last fall by the short seller Hindenburg, which said its “in-game research revealed an X-rated pedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech.”
In August, Louisiana filed a similar lawsuit, alleging that Roblox “permitted and perpetuated an online environment in which child predators thrive.” A couple of months later, the state of Kentucky also sued Roblox, calling it “a hunting ground for child predators.” Last month, Florida Attorney General James Uthmeier subpoenaed Roblox over similar allegations. It’s not just states suing Roblox, either. Numerous families and Roblox players have also sued the platform for alleged abuse, such as the cases detailed in Texas’s lawsuit.
Eric Porterfield, Senior Director of Policy Communications at Roblox, responded to the lawsuit in a statement to The Verge, saying, “We are disappointed that, rather than working collaboratively with Roblox on this industry-wide challenge and seeking real solutions, the AG has chosen to file a lawsuit based on misrepresentations and sensationalized claims.” He added, “We have introduced over 145 safety measures on the platform this year alone.”
Roblox reported in September that it has over 111 million daily active users, many of whom are children. Earlier this year, Roblox announced plans to roll out age verification using IDs and facial scans, along with an AI system intended to “detect early signals of potential child endangerment.”
Those moves echo similar changes at social media platforms like Discord, which also began rolling out age verification this year and has even been cited in some of the same lawsuits filed against Roblox, including one case involving a 13-year-old from Texas. Social media platforms have often successfully invoked Section 230 to shield themselves from liability for their individual users’ actions, however, and this Roblox suit, like others against the company, will face that same barrier.