Character.AI, a prominent platform for AI companions, has implemented a major policy change to address growing concerns about youth mental health: starting November 25, users under 18 are prohibited from engaging in open-ended conversations with its AI characters. The company, which calls its approach "more conservative than our peers," now limits teenagers to features such as video creation and a newly launched interactive "Stories" mode that leads users through fictional storylines. The change comes amid mounting concern over the mental health risks posed by AI chatbots, especially in the wake of lawsuits claiming the app played a role in teen suicides.
This policy update follows a year of heightened public and legal scrutiny. Character.AI has been the subject of several lawsuits, including one filed by the family of 14-year-old Sewell Setzer III, who was reportedly drawn into excessive use of the service. CEO Karandeep Anand acknowledged the risks, explaining that the new rules align with the company’s goal to "create an engaging and secure environment," as reported by CNBC. Although minors make up less than 10% of the platform’s 20 million monthly users, specialists caution that the constant availability of chatbots can foster dependency, particularly among at-risk adolescents.
To help ease the impact of the restriction, Character.AI has rolled out "Stories," an interactive-fiction feature positioned as an alternative to open-ended chats. The company promotes the tool as a "safety-first environment" for teens to interact with characters, but reaction on platforms like Reddit has been divided: according to TechCrunch, some young users voiced disappointment over losing chatbot access, while others recognized the measure as necessary to prevent overuse. The platform has also collaborated with mental health groups such as Koko and ThroughLine to offer emotional support resources during the transition.
Mental health professionals have weighed in, highlighting the critical role of real-life social interaction in adolescent development. A psychotherapist cited in CNBC’s coverage pointed out that 21% of teens aged 13 to 17 report feeling lonely, an issue the company’s new policy seeks to address. Still, some critics believe that simply banning chatbots may not tackle the root causes. "Humans are inherently social beings," one expert remarked, emphasizing the need for broader societal interventions.
Character.AI’s new direction mirrors a larger movement within the industry. Interactive fiction has surged in popularity recently, and other companies, including OpenAI, have also faced lawsuits over the mental health effects of AI. While these changes may not fully satisfy users who depended on chatbots for support, they represent a careful attempt to balance technological progress with user safety in a largely unregulated field.