Most people know Roblox as an online platform where you can play games made by developers; it’s an easy way to play with your friends without having to be with them in person. However, Roblox has been gaining a reputation for not being suitable for kids, which has led to parents removing their kids from the online platform.
Roblox is available to kids of all ages and places restrictions on accounts belonging to users under 18. It offers parental controls for accounts under 13 and reserves features such as voice chat and access to games with “moderate” content maturity labels for users 13 and older.
A user’s age is initially taken on trust from the birth date entered at account creation; only later does Roblox require a government-issued photo ID to confirm that a user is over 13.
Even with all of these precautions, Roblox continues to struggle to regulate content created on its site. Games such as “Boys and Girls Hangout” were developed to harbor online daters and predators. Others go by code names such as “Scented Cons,” “Condos,” and “Conted Scent,” derived from “consent”; in these games, users indulge in conversations and role-playing involving inappropriate or suggestive behavior that violates the platform’s rules. Roblox does not sanction these games, but because developers continuously create new ones, the platform is stuck in a cycle of create and delete.
Most games that include inappropriate content are roleplay or hangout games, and Roblox has restricted these genres to users 18 and older in an attempt to protect minors on its platform. Roblox continues to fight back against inappropriate content reaching underage accounts, in the hope of putting an end to games unsuitable for its core demographic.
With a fanbase that includes both children and adults – but with the platform primarily marketed toward kids – one would expect Roblox to enforce strict moderation to ensure online safety for its younger users. However, Roblox’s moderation system often falls short, leaving children exposed to inappropriate content and interactions.
Age rating system
Roblox implements a content maturity system that categorizes games into four levels: minimal, mild, moderate, and restricted (17+). Minimal maturity contains content suitable for all ages; mild may include some violence and crude humor; moderate includes violence, realistic blood, and unplayable gambling content; restricted contains strong violence, realistic blood, romantic themes, and alcohol.
A user’s age, entered during account creation, determines their default safety settings, such as which game content they can access. As players get older, Roblox notifies them about new features that become available. Parents can adjust these settings under Parental Controls or Content Restrictions in the account settings. Alongside the maturity labels, access to games is tiered by user age: 5+, 9+, 13+, and 17+.
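To make the tiering concrete, here is a minimal sketch of how such an age-based content gate might work, assuming the four maturity labels and the 5+/9+/13+/17+ tiers described above. The function names and thresholds are illustrative assumptions, not Roblox’s actual implementation.

```python
# Illustrative sketch of an age-tiered content gate based on the four
# maturity labels above. Names and thresholds are assumptions for
# illustration only, not Roblox's actual code.

MATURITY_ORDER = ["minimal", "mild", "moderate", "restricted"]

def max_maturity_for_age(age: int) -> str:
    """Map a (self-reported) age to the highest label a user may access."""
    if age >= 17:
        return "restricted"
    if age >= 13:
        return "moderate"
    if age >= 9:
        return "mild"
    return "minimal"  # the 5+ tier

def can_play(user_age: int, game_label: str) -> bool:
    """True if the game's label is at or below the user's allowed tier."""
    allowed = MATURITY_ORDER.index(max_maturity_for_age(user_age))
    return MATURITY_ORDER.index(game_label) <= allowed

print(can_play(12, "restricted"))  # False for an honest 12-year-old
print(can_play(18, "restricted"))  # True -- nothing stops a child from typing 18
```

The weakness described next is visible in the last two lines: the entire gate keys off an age the user typed in themselves.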
Despite these precautions, the system is easy to bypass. Players can simply enter a false birthdate to gain access to older content. Moreover, some games rated minimal or mild have been found to include strong sexual undertones, making them inappropriate for the younger audiences they’re supposedly safe for.
In-game chat
Both in-game text chat and voice chat on Roblox suffer from inconsistent moderation – sometimes overly strict, other times far too lenient.
In text chat, Roblox uses an automated filter that censors inappropriate words by replacing them with hashtags (#). It flags anything vulgar, threatening, discriminatory, or sexual. Players under 13 (under 18 before a 2024 policy change) are prohibited from sharing anything that resembles personal information, such as names, addresses, and phone numbers. Direct messaging is also restricted for younger users unless a parent grants permission; in some cases, parental controls can block younger players from sending messages entirely, both in-game and in direct messages.
However, these systems are far from foolproof. Players often evade the filter by substituting letters with similar-looking characters (e.g., replacing “E” with “3”), making detection easy to dodge. As a result, younger players can still leak personal information, and older players can still slip predatory language past the filter.
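A toy sketch makes the evasion obvious. The blocklist, the look-alike mapping, and the function names below are hypothetical illustrations, not Roblox’s actual filter; the point is only that a filter matching literal words misses leetspeak unless it normalizes characters first.

```python
# Toy example of a hashtag-style word filter and the leetspeak evasion
# described above. Blocklist and mapping are illustrative placeholders.

import re

BLOCKED = {"badword"}  # stand-in for a real blocklist

# Map common look-alike characters back to letters before matching.
LOOKALIKES = str.maketrans(
    {"3": "e", "4": "a", "0": "o", "1": "i", "5": "s", "@": "a", "$": "s"}
)

def naive_filter(message: str) -> str:
    """Replace blocklisted words with hashtags, matching literal text only."""
    def censor(m):
        return "#" * len(m.group()) if m.group().lower() in BLOCKED else m.group()
    return re.sub(r"\w+", censor, message)

def normalized_filter(message: str) -> str:
    """Same filter, but normalize look-alikes first so 'b4dw0rd' still matches."""
    def censor(m):
        word = m.group().lower().translate(LOOKALIKES)
        return "#" * len(m.group()) if word in BLOCKED else m.group()
    return re.sub(r"[\w@$]+", censor, message)

print(naive_filter("badword"))       # '#######'  -- caught
print(naive_filter("b4dw0rd"))       # 'b4dw0rd'  -- slips through
print(normalized_filter("b4dw0rd"))  # '#######'  -- caught after normalization
```

Real moderation systems layer many more signals on top, but the core cat-and-mouse dynamic is the same: every new substitution trick requires a new normalization rule.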
Voice chat
Roblox’s voice chat feature is officially restricted to players aged 13 and older, and those players must provide a verified phone number or government-issued ID to use it. However, children can easily bypass the system by using a parent’s identification.
Roblox’s voice chat moderation relies on a hybrid of human reviewers and AI, but given the platform’s enormous user base, consistency is a major issue. The AI can mistakenly ban players for harmless conversations – especially those speaking other languages or using words the algorithm misinterprets. These bans often come with little to no explanation, making appeals difficult.
Conversely, genuinely harmful behavior – such as harassment or blasting loud, disruptive sounds – often goes unpunished. False reports can also trigger unnecessary bans, creating further frustration.
Difficulty with human moderation
With over 100 million active users daily, the sheer volume of content on Roblox is impossible for human moderators to manage effectively. Reports suggest that the Trust & Safety team on Roblox receives far more complaints than they can review in a single day, resulting in delayed or automated responses.
Former Trust & Safety employees have said that requests for additional moderation resources were ignored by company leadership. As a result, user reports and appeals often receive only generic automated replies, leaving players confused and dissatisfied with the platform’s lack of transparency.
While Roblox provides tools such as age ratings, chat filters, and parental controls, these measures are often too easy to bypass and are inconsistently enforced. The combination of AI-driven moderation errors, inadequate human oversight, and limited transparency has created an environment where children can still encounter inappropriate content and unsafe interactions.
When children’s safety comes into question, parents and the government don’t just sit back and let it happen. Over the years since it rose to popularity, Roblox has faced several legal battles over children’s safety.
The lawsuits began primarily in the United States in 2022 but have intensified this year as safety concerns reach new heights.
Roblox v. Ruben Sim
Ruben Sim is a YouTuber who, between late 2021 and early 2022, made Roblox content for roughly 800,000 subscribers. In November 2021, Roblox sued Sim for $1.6 million, alleging that he violated platform rules, posted terrorist threats, harassed others, and posted offensive content. Sim countered that his videos were actually meant to expose illegal, predatory, and inappropriate content that Roblox was negligent enough to allow. He posted a video to his YouTube channel titled “Roblox Tried To Sue Me For $1.6 Million,” in which he claimed that the offensive content (one instance Roblox cited involved Sim posting a nude picture of a Roblox administrator) was not actually posted by him; he had only referenced it in his videos. Sim also argued that Roblox should focus on defeating the actual predators making the platform unsafe for children rather than suing him for showing concern about it.
Roblox and Sim settled the lawsuit out of court in January 2022: Sim paid Roblox $150,000 and is permanently banned from the platform. The case remains controversial, and because it was settled rather than tried, gaps remain in the public record – Sim’s actions are not totally clear. What is clear is that Roblox made no successful attempt to address the child safety concerns Sim raised, which remain a major issue in 2025.
Florida Attorney General Uthmeier Issues Civil Subpoena
On April 16, 2025, the Attorney General (AG) of Florida issued a civil subpoena to Roblox. A subpoena is a mandatory court order that commands a person to either appear in court or produce documents or similar evidence for use in court. In this case, AG Uthmeier demanded documents detailing Roblox’s approach to marketing to children, age verification, and chat moderation. After six months of investigating these documents, Uthmeier concluded on October 20, 2025, that “Roblox is a breeding ground for child predators.”
Louisiana Attorney General Murrill sues Roblox
AG Murrill sued Roblox on August 18, 2025, claiming that the platform endangers the children of Louisiana. One example comes from Livingston Parish, Louisiana, where a man used technology to alter his voice to sound like a girl. Additionally, a 13-year-old girl was allegedly introduced to a predator through the platform and kidnapped across multiple states. AG Murrill believes Roblox should be shut down, which the platform disputes; Roblox says it does not encourage child exploitation and has measures in place to combat it. However, AG Murrill cites the lack of age verification at sign-up as cause to drastically change or take down the app.
These are just a few examples of the lawsuits Roblox faces over child safety concerns. Civil cases against the platform have increased sharply, especially in 2025. While Roblox insists it is doing everything it can to fix the issues, something major will have to change before the heat around the problem dies down.
China has strict, government-mandated gaming laws that place many limitations on games. Roblox’s user-created content could not comply with these regulations, so the platform had to shut down in China.
The countries banning or blocking Roblox are overwhelmingly Middle Eastern, and they cite unsafe content for children as the driving force behind their bans.
While the United States is nowhere close to banning Roblox, concerns have been raised there about its safety, especially for children. As noted above, Louisiana Attorney General Murrill’s August 2025 lawsuit accuses Roblox of facilitating child sexual exploitation.
There are millions of players on Roblox every day, and with so many players it is difficult to police the inappropriate actions of a few. Multiple accounts show that Roblox players have gotten away with pedophilia and sexual abuse online without the platform stepping in.
One player wouldn’t let such misconduct go unnoticed.
Schlep, whose real name is Michael, is a Texan YouTuber with over 2 million subscribers who took the policing of Robloxian predators into his own hands. Schlep and his team would pose as underage kids and strike up conversations with pedophiles.
When Schlep was 15, he was himself the target of a predator. One of the developers of his favorite game reached out to him, and the relationship soon turned abusive. Schlep has stated that the developer groomed him viciously, driving him to attempt suicide. He asked Roblox to act against such behavior to prevent the same thing from happening to other kids, but he received no answer.
Now 22, Schlep targets predators, and his work has led to the arrest of six of them. His efforts were cut short, however: on August 9, 2025, Schlep was banned from Roblox for violating its terms of service. Roblox justified the ban on the grounds that Schlep was assuming false identities to lure predators into committing illegal acts, which he would later share publicly on his YouTube channel. Roblox also sent Schlep a cease-and-desist letter, threatening legal action if he continued.
The ban was unexpected and sparked outrage among parents who, once aware of Schlep’s efforts to protect kids online, opposed it. The hashtag #FreeSchlep swept across social media as infuriated Roblox fans hoped the moment would become a turning point for the platform. Many users had expected Roblox to work with Schlep toward a new, predator-free era rather than ban him outright.
Roblox did roll out a form of age regulation in an attempt to make its games safer: depending on the age of the account creator, certain games and chats are inaccessible, limiting the content available to minors. But these restrictions are easy to bypass, since an account creator can simply lie about their age and gain access to games that should not be available to them.
Schlep’s ban eroded many users’ trust in Roblox and made adults more aware of the dangers of online social platforms. The controversy caught the attention of US Congressman Ro Khanna, who started a petition for Roblox to do more to protect children, provide support to parents, and strengthen law enforcement action against predators.
As if concerns over chat moderation, age verification, and offensive, inappropriate content weren’t enough, the CEO of Roblox has discussed adding an online dating feature to the platform.
Roblox CEO David Baszucki first pitched the idea in 2023 at a Roblox Developers Conference. It drew renewed attention – and sparked outrage – in August 2025, when Baszucki raised it publicly again during an appearance on the “Tech Stuff” podcast, hosted by Oz Woloshyn and Karah Preiss.
Baszucki wants to create the online dating feature to provide an opportunity for lonely people to make connections with each other. He also wants to ease the transition into the dating world by making it virtual first. On the “Tech Stuff” podcast, Baszucki told Woloshyn and Preiss that he “think[s] a lot of people who are too afraid to go on a real-life date might find it easier to have a virtual date to start, and then if they connect, move to the physical world.”
The online dating feature would only be available to players 21 and older, and they would be required to verify their age with valid identification.
Despite this verification step, Roblox becoming an online dating platform would upset parents and politicians alike. Given how the platform has handled its other safety issues, adults are not optimistic that Roblox can successfully keep children off a dating feature.
Since Baszucki hasn’t set an official date for the feature to arrive on Roblox, many details remain unconfirmed. Neither he nor any other Roblox representative has explained what identification would be required or what the verification process would look like. Whatever is planned will have to be significantly more effective than the current system: there have already been multiple instances of adults and teenagers subverting AI moderation to pose as children, and unless verification security is substantially strengthened, it would not be out of the question for children to pose as adults and access the dating feature.
Dedicated dating apps such as Hinge, Tinder, and Bumble have their own security measures to keep users safe. Hinge uses selfie verification with 3-D face authentication to prove that a user matches their photos, and it scans conversations for inappropriate content with message filters. Tinder offers privacy settings that let users choose who can see what on their profile, chat filters and moderation, blue checkmarks next to verified profiles, and a reporting feature. Bumble uses most of the same features and adds AI blurring to censor nude photos, as well as in-app calls so users don’t need to give out their real phone numbers.
If Roblox implements these features correctly and ensures that only people 21 and older use the dating feature, safety problems could be contained. But given the platform’s track record, the rollout is unlikely to run smoothly. Besides, anyone looking for online dating can simply sign up for one of the established platforms for free rather than using Roblox.
Roblox’s ineffective moderation leaves children exposed to destructive behaviors that erode their mental health and development. Children can be affected by everything from the games they play to the players they interact with.
Roblox hosts many “simulator games,” in which the player progresses by repeating a certain task over and over; examples include Pet Simulator and Speed Simulator. Because the tasks can be long and tiring, these games use reward systems that encourage players to spend money to progress, selling in-game items for Robux, the platform’s virtual currency. The perceived need for these items can lead players, typically younger ones, to spend carelessly, reinforcing bad spending habits.
Moreover, such games also offer “mystery boxes,” where players have a set percentage chance of winning a common, uncommon, rare, epic, or legendary item. Legendary items are usually close to impossible to win, yet children feel compelled to obtain them to do better in the game and impress other players. This leads kids to purchase mystery boxes again and again, creating an unhealthy, obsessive cycle of spending.
Children also forget that in-game currency corresponds to real-life money. Converting real money into in-game money creates a psychological disconnect that can further increase spending, and limited-time, exclusive offers trigger quick purchases before younger players can register the real-world cost.
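Some rough arithmetic shows how quickly this adds up. The drop rate and box price below are hypothetical examples, and the exchange rate is an approximation based on common Robux bundle pricing (around 400 Robux for $4.99); none of these figures come from a specific Roblox game.

```python
# Back-of-the-envelope sketch of the two effects described above.
# All numbers are hypothetical examples except the exchange rate,
# which approximates common bundle pricing (400 Robux for $4.99).

USD_PER_ROBUX = 4.99 / 400   # ~ $0.0125 per Robux

BOX_PRICE_ROBUX = 250        # hypothetical mystery-box price
P_LEGENDARY = 0.005          # hypothetical 0.5% legendary drop rate

# With independent draws, the expected number of boxes needed to hit
# one legendary is 1 / p (geometric distribution).
expected_boxes = 1 / P_LEGENDARY
expected_robux = expected_boxes * BOX_PRICE_ROBUX
expected_usd = expected_robux * USD_PER_ROBUX

print(f"Expected boxes opened: {expected_boxes:.0f}")   # 200
print(f"Expected Robux spent:  {expected_robux:,.0f}")  # 50,000
print(f"Expected real cost:    ${expected_usd:,.2f}")   # $623.75
```

Even with these fairly generous odds, the expected cost of chasing a single top-tier item runs to hundreds of real dollars – a sum that is largely invisible to a child watching a Robux balance tick down.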
Similar to other video games and media platforms, children on Roblox can experience hurtful comments and harassment from other players. The anonymity of the internet often emboldens bullies, making the experience distressing. Over time, cyberbullying can seriously affect a child’s mental health, leading to depression and anxiety.
Compounding the usual effects of video games, Roblox’s lack of effective moderation of sexual and violent content can desensitize children to harmful behavior while exposing them to fear, anxiety, and even aggressive or manipulative conduct they may imitate. Some players experience grooming, in which predators build false trust to extract personal information or sexually exploit children. Exposure to such extreme and inappropriate content has been linked to panic attacks and lasting trauma.
Without adequate oversight, children can be influenced by these negative interactions, which can normalize aggressive, manipulative, and predatory behavior and desensitize them to danger. This lack of protection leaves younger users vulnerable to trauma and exploitation, highlighting the urgent need for stricter safety measures on the platform.