
Over the past year, the role of chatbots in teens’ lives has shifted dramatically. What was once a novelty, used for homework help or emotional support, has become a more complex and sometimes concerning part of our students’ digital lives. With new California legislation aimed at regulating AI interactions with minors, and growing awareness of the mental health risks tied to chatbot dependency, it’s clear that educators and families must stay engaged. As AI tools become more emotionally intelligent and widely accessible, the need for open, thoughtful conversations with students is more urgent than ever. It’s been a year since we posted The Pros and Cons of Chatbot Use for Teens, and we wanted to offer a legal update and fresh resources on social-emotional wellness, digital boundaries, and the importance of human connection in an AI-driven world.
California Legal Updates on AI Chatbots
California has become the first state to enact comprehensive legislation addressing the safety of AI companion chatbots used by minors. In October 2025, Governor Gavin Newsom signed Senate Bill 243, which will take effect on January 1, 2026.
SB 243 is designed to protect children and other vulnerable users from harms associated with AI companion chatbot use. Under the new law:
- Chatbots must clearly disclose they are AI (not human).
- During prolonged interactions, minors must receive a notification at least every three hours reminding them that they are talking to an AI and encouraging them to take a break.
- Operators are required to prevent chatbots from producing content involving self-harm, suicide, or sexual behavior when interacting with minors, and they must have crisis response protocols in place, including referring users to suicide prevention hotlines.
- Companies must adopt reasonable measures to block chatbots from showing sexually explicit content or suggesting sexual activity with minors.
SB 243 establishes legal recourse for families, allowing them to sue companies that ignore safety protocols. Developers must publish details of their crisis protocols and report suicide risk interventions annually to the state health department.
California’s proactive stance on creating these guardrails for students is critical as chatbots continue to grow in sophistication and reach.
Updated Resources to Learn More:
- Common Sense Education Report – Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions
- Psychology Today Article – AI Companions and Teen Mental Health Risks
- ABC News – As Teens Turn to AI Companions for Support, Experts Share How Parents Can Respond
- Stanford Report – Why AI Companions and Young People Can Make for a Dangerous Mix
- Screenagers Article – 5-Step Plan To Help Kids and Teens with AI Companion Chatbots
Given the growing body of research raising concerns about AI chatbots, and the fact that teens’ interactions with AI can shape their emotional growth for better or worse, it is crucial for educators and parents to talk with teens about their chatbot experiences.
These discussions can include:
- Exploring emotional impact: Ask students how their interactions made them feel, what they learned, and how those experiences differed from talking with real people.
- Comparing support systems: Talk about the difference between AI-generated responses and genuine support from trusted adults or friends, reinforcing the value of empathy, social skills, and human connection.
- Setting healthy boundaries: Collaboratively establish realistic time limits for chatbot use and set clear expectations and goals to promote balanced digital habits.
While AI chatbots can offer comfort, they are not a substitute for human connection. By continuing to have meaningful conversations, setting healthy boundaries, and encouraging critical thinking and reflection, educators and families can help students navigate AI chatbots with greater understanding and emotional awareness. With research indicating that chatbot use among teens is widespread, we can help ensure that it complements, rather than replaces, the in-person support that is so crucial to adolescent development.

