Roblox

Roblox Corp. has rejected assertions in a lawsuit filed by the Louisiana attorney general that its gaming platform facilitates the exploitation of young children, calling them categorically untrue and vowing to work with state officials to protect kids from harm.

Last week, Attorney General Liz Murrill filed a petition in the 21st Judicial District Court in Livingston Parish seeking an injunction and penalties against Roblox, which operates a popular gaming site geared toward children and teens. Murrill filed the lawsuit after parish law enforcement officers last month arrested a person who was active on the Roblox site on suspicion of possessing child sexual abuse material.

“Defendant (Roblox) has permitted and perpetuated an online environment in which child predators thrive, directly contributing to the widespread victimization of minor children in Louisiana,” the lawsuit states.

A Roblox spokesperson, however, told the Louisiana Record that the company is diligently working to keep underage site users safe and will work cooperatively with Murrill’s office. The charge that Roblox is not dedicated to keeping children safe is “categorically untrue,” the spokesperson said.

“We hold ourselves to the highest standard and work constantly to remove violative content and bad actors,” the spokesperson said. “And we continue to innovate and add new safety features regularly in an effort to help protect all users on Roblox. We share Attorney General Murrill’s urgency to help keep kids safe because safety has always been our priority.”

To improve safety, the platform has added more than 50 new features to protect its youngest users and give their parents and caregivers more control over what their children can access on Roblox. The controls include ways to block content based on maturity ratings, to remove and report people on a child’s friends list and to set limits on screen time per day.

And this month, the company debuted Roblox Sentinel, a system powered by artificial intelligence (AI) that is designed to flag child-endangerment interactions for human intervention, according to the company.

The lawsuit pointed to reports that Roblox hosted various “Diddy” games, with titles such as “Survive Diddy,” “Run from Diddy Simulator” and “Diddy Party,” all of which seem to be based on incidents of abuse involving the music mogul Sean Combs, also known as “Diddy.” Roblox, however, has stressed that “Diddy” experiences run counter to company policies and that a team is working to remove such content from the platform.

A Trust & Safety team now works quickly to remove inappropriate content from the site using ongoing 24/7 surveillance, user reports and AI scans, according to Roblox. The company maintains that though nearly two-thirds of its users are 13 or older, the platform’s policies are stricter than those of social media networks and other platforms with user-generated content.

The company also reports that it uses advanced age-estimation technology to verify users’ ages, provide age-appropriate experiences and ensure safety.

Roblox stressed that it works in conjunction with law enforcement agencies, mental health authorities and parental advocacy groups to deal with serious threats. Last year, the company submitted nearly 25,000 reports to the National Center for Missing and Exploited Children (NCMEC), which represents 0.12% of the 20.3 million reports that reached the NCMEC.

“No system is perfect and bad actors adapt to evade detection, including efforts to take users to other platforms, where safety standards and moderation practices may differ,” Roblox said in a statement on its website. “We continuously work to block those efforts and to enhance our moderation approaches to promote a safe and enjoyable environment for all users.”