Roblox has reiterated the moderation and monitoring systems it has in place for inappropriate content, after it emerged that users are recreating real-world mass shootings in-game.
The situation was highlighted by the Anti-Defamation League’s director of strategy and operations, who posted on Twitter that he regularly found Roblox scenarios built around the 2019 Christchurch mosque shooting.
He also expressed concern that the platform is “aimed at very young children.”
Speaking to The Verge, Roblox stressed that it has systems in place to block this type of inappropriate content and is continually working to improve them.
“We promptly removed this experience from Roblox after it was brought to our attention and suspended the user responsible for violating our Community Rules,” the company said.
“We do not tolerate racism, discriminatory speech, or content related to tragic events. We have a stringent safety and monitoring system that is continuously active and that we rigorously and proactively enforce.”
The spokesperson added that recreations of the Christchurch shooting are particularly difficult to block as an automatic text search would block all references to the city itself.
“In this case our proactive detection includes human review to balance allowing references to the geographic location — New Zealand’s largest city — but not uses that violate our policies,” the spokesperson added.
Roblox faces considerable challenges in this regard. For one thing, it has an audience of 43.2 million daily active users, all of whom have the ability to create such content.
The company’s recent financials also revealed strong growth in the number of active users over the age of 13, a demographic more likely to create and engage with shooter experiences built within Roblox.
Earlier this year, Roblox also came under fire from parent watchdog group ParentsTogether over its monetisation model and how susceptible this makes children to overspending.