After Teen Deaths and Lawsuits, Character.AI Is Banning Under-18 Chats With Its AI Companions

[Image: iMac M1, by N.Tho.Duc on Unsplash]

Starting November 25, Character.AI is shutting down chatbot access for users under 18. It’s one of the strictest moves yet in the rapidly growing world of AI chat services—and it’s coming after a wave of heartbreaking lawsuits.

The decision follows multiple lawsuits from families who say the platform’s AI bots contributed to the deaths of their children. Some of these bots reportedly posed as adult romantic partners or therapists in conversations with teens, raising urgent concerns about the safety of minors on the platform.


What’s Changing and When?

If you’re under 18, you won’t be able to start or continue conversations with chatbots on Character.AI after November 25. You’ll still be able to revisit your old chats, but you won’t be able to send new messages or interact with the bots.

Between now and that cutoff, the company says it plans to gradually scale back access:

  • A daily 2-hour usage limit will kick in soon for underage users.
  • Character.AI will try to identify underage users by analyzing interactions and tapping into connected social media accounts.

This is a pretty big shift for a platform that, until recently, didn’t verify ages at all.

[Image: a judge’s gavel, by Sasun Bughdaryan on Unsplash]


Why This Is Happening

While Character.AI has quickly grown to around 20 million monthly users, the consequences of unfiltered access to its AI bots are becoming clear. Two lawsuits are especially striking:

  • The family of 14-year-old Sewell Setzer III says a chatbot relationship played a major role in their son’s suicide.
  • In Colorado, a family is suing after their 13-year-old daughter, Juliana Peralta, died by suicide after using the platform in 2023.

These aren’t isolated cases. This platform and others like it are drawing increasing criticism over how emotionally entangled their chatbots can become, especially with young, vulnerable users.


What the Company Says

CEO Karandeep Anand told The New York Times that it was time to take bold action: “We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them.”

To that end, Character.AI says it’s working on new features more appropriate for younger users—things like creating videos, stories, and streams with AI characters, but without the interactive chat side.

The company also plans to launch an AI safety lab to further study and address these issues.

[Image: AI safety lab, by sky on Unsplash]


Industry and Government Are Watching Closely

Character.AI isn’t alone in the spotlight. Other AI platforms, including OpenAI’s ChatGPT, are also under pressure to add guardrails for young users. In September, OpenAI introduced parental control tools after facing a similar lawsuit.

Governments are starting to step in too:

  • California Governor Gavin Newsom signed a law that requires AI services to build safety tools into their systems, starting January 1.
  • Senators Josh Hawley and Richard Blumenthal have introduced a federal bill to block AI companion apps from being used by minors.

California State Senator Steve Padilla, who’s been pushing for AI safety reforms, summed it up with this: “The stories are mounting of what can go wrong. It’s important to put reasonable guardrails in place so that we protect people who are most vulnerable.”


What’s Next?

For adult users, nothing major changes—Character.AI will still offer its customizable AI companions to paying subscribers (currently around $8/month). But moving forward, the platform is drawing a clear line: no more open-ended AI conversations for teens.

It’s a serious move from a company under intense scrutiny, and it could set the tone for how the rest of the AI chatbot industry handles youth access in the months to come.

Whether you’re a parent, developer, or just someone following the AI space, this feels like a moment to watch. The bots are here, but the rules are only just starting to catch up.

Keywords: Character.AI, AI safety, chatbot regulations, youth access, lawsuits, AI companions

