ORLANDO — More than 145 lawsuits filed against the online gaming platform Roblox are drawing increased scrutiny of how tech companies protect children online, with plaintiffs alleging the company failed to implement adequate safeguards against predatory behavior and exploitation on its platform.
Whitney Ray Di Bona, an attorney and consumer safety advocate, said the lawsuits focus heavily on claims that Roblox made it too easy for adults to contact children and too difficult for parents to monitor or intervene in online interactions.
“The main claims are that Roblox made it too easy for predators to contact children and too difficult for parents to step in,” Di Bona said in an interview with The Florida Record.
She noted that until 2024, adults were allegedly able to directly message children of any age through the platform.
The lawsuits further claim children could create accounts without parental knowledge and that Roblox rejected proposals that would have required parental approval for account creation.
The litigation also centers on allegations that predators used Roblox’s in-game currency, Robux, to establish trust with children before directing them to less-regulated platforms such as Discord, where plaintiffs claim exploitation escalated outside Roblox’s oversight.
Di Bona said attorneys involved in the cases are particularly concerned by what they describe as organized patterns of grooming behavior occurring through the platform.
According to the allegations, predators posed as peers, used virtual gifts and Robux to build relationships with children, and then encouraged minors to move conversations to other apps.
“Once that switch happens, Roblox’s protections disappear,” Di Bona said, describing what attorneys call a “gateway problem” at the center of many of the lawsuits.
The cases represent part of a broader wave of litigation aimed at technology and social media companies over child safety concerns.
Di Bona said the Roblox litigation differs from earlier lawsuits involving companies such as Meta and YouTube because Roblox was specifically designed and marketed toward children, including users as young as five years old.
“These lawsuits are part of a growing trend, and the legal landscape is shifting quickly in favor of plaintiffs,” she said.
Di Bona added that legal experts increasingly view the Roblox multidistrict litigation as a significant new mass tort involving child safety online, comparable in some ways to ongoing social media addiction litigation.
She also pointed to recent jury verdicts involving Meta that plaintiffs believe could help establish legal precedent for proving liability in cases involving child exploitation and mental health harms connected to online platforms.
Central to the lawsuits are questions surrounding Section 230 of the Communications Decency Act, the federal law that generally shields online platforms from liability for user-generated content.
The plaintiffs argue, however, that Roblox’s own platform design decisions contributed to the alleged harms, creating a distinction from cases focused solely on third-party user conduct.
“Plaintiffs argue that Roblox’s own design choices — not just user content — caused the harm,” Di Bona said.
She also noted that several state attorneys general have pursued claims under consumer protection laws, alleging Roblox misrepresented the safety of its platform to users and families.
Di Bona said the litigation could ultimately help establish a new legal standard for what constitutes reasonable child protection measures for online platforms.
The lawsuits are also fueling broader discussions about government regulation of technology companies and child safety requirements online.
Di Bona pointed to a recent settlement in Nevada that required Roblox to adopt new safety measures alongside financial penalties. Combined settlements involving states total approximately $35 million, she said.
At the federal level, Di Bona said recent verdicts against Meta have intensified bipartisan discussions in Congress surrounding potential reforms to Section 230 and new child safety legislation.
She said the outcome of the Roblox litigation could have implications far beyond a single company.
“If courts decide that platform design choices — not just user actions — can create legal responsibility, every tech company making products for children will have to seriously rethink their approach,” she said.
Roblox has begun implementing new safety measures amid the litigation, including age-based account restrictions scheduled to roll out in June for users ages 5 to 15.
The updated system is intended to place limits on communication features and content access based on a user’s age.
Di Bona described the changes as “a positive step,” but said critics remain concerned about the effectiveness of the platform’s safeguards.
She pointed to Roblox’s facial age-verification system introduced earlier this year, which she said users were reportedly able to bypass using altered images.
“Good intentions on paper do not protect children in real life,” Di Bona said.
She urged parents to remain actively involved in monitoring their children’s online activity, noting that no platform should be considered completely safe for unsupervised minors.
Parents, she said, should pay attention to secrecy surrounding online behavior, unfamiliar online friends and unexplained gifts received through games.
“Predators rarely stay on Roblox,” Di Bona said. “They use it as a first contact and then try to move children to other apps quickly.”
She added that understanding every platform children use across their devices “is not optional” for families attempting to protect minors online.
