AI Moderation and Legal Frameworks in Child-Centric Social Media: A Case Study of Roblox
Mohamed Chawki
Additional contact information
Mohamed Chawki: Law Department, Naif Arab University for Security Sciences, Riyadh 11452, Saudi Arabia
Laws, 2025, vol. 14, issue 3, 1-38
Abstract:
This study uses Roblox as a case study to explore the legal and technical challenges of content moderation on child-focused social media platforms. As a leading Metaverse platform with millions of young users, Roblox provides immersive and interactive virtual experiences but also introduces significant risks, including exposure to inappropriate content, cyberbullying, and predatory behavior. The research examines the shortcomings of current automated and human moderation systems, highlighting the difficulties of managing real-time user interactions and the sheer volume of user-generated content. It investigates cases of moderation failures on Roblox, exposing gaps in existing safeguards and raising concerns about user safety. The study also explores the balance between leveraging artificial intelligence (AI) for efficient content moderation and incorporating human oversight to ensure nuanced decision-making. A comparative analysis of moderation practices on platforms such as TikTok and YouTube provides additional insights to inform improvements in Roblox’s approach. From a legal standpoint, the study critically assesses regulatory frameworks such as the General Data Protection Regulation (GDPR), the EU Digital Services Act, and the UK’s Online Safety Act, analyzing their relevance to virtual platforms like Roblox. It emphasizes the pressing need for comprehensive international cooperation to address jurisdictional challenges and establish robust legal standards for the Metaverse. The study concludes with recommendations for improved moderation strategies, including hybrid AI-human models, stricter content verification processes, and tools to empower users. It also calls for legal reforms to redefine virtual harm and enhance regulatory mechanisms. This research aims to advance safe and respectful interactions in digital environments, stressing the shared responsibility of platforms, policymakers, and users in tackling these emerging challenges.
Keywords: content moderation; Roblox; artificial intelligence; legal frameworks; online safety
JEL-codes: D78 E61 E62 F13 F42 F68 K0 K1 K2 K3 K4
Date: 2025
Downloads:
https://www.mdpi.com/2075-471X/14/3/29/pdf (application/pdf)
https://www.mdpi.com/2075-471X/14/3/29/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jlawss:v:14:y:2025:i:3:p:29-:d:1642203
Laws is currently edited by Ms. Heather Liang
More articles in Laws from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.