Are AI Therapy Chatbots Safe for Teens? Here’s What You Need to Know
During the Covid pandemic, I supported students virtually in schools. While it was nice to connect with them (and their pets on camera), meaningful counseling progress was rare. The magic of therapy—true connection, real conversation—was hard to replicate through a screen.
Adults adapted more easily to online therapy. But when students returned to school, they were clearly happier talking to a real person face-to-face.
Now in 2025, artificial intelligence therapy chatbots are becoming more common. Can they actually help young people with their mental health? Or are they a risky shortcut? I’m sharing insights from a recent experiment where a psychiatrist posed as a struggling teen and tested these mental health bots. The results might surprise you.
Overview: The Rise of AI Therapy Bots for Teen Mental Health
AI chatbots are increasingly being used by young people as mental health tools. Some claim to offer therapy-like support for issues like anxiety, depression, and loneliness. Dr. Andrew Clark, a psychiatrist in Boston, was curious about how well these bots actually work. He decided to test several of the most popular ones by posing as a teen in emotional distress. His findings raise important questions about how safe—or effective—these tools really are.
He created a fictional 15-year-old persona and messaged multiple AI therapy platforms. He told them he was lonely and anxious, and he even mentioned thoughts of self-harm.
The bots’ responses were inconsistent. Some were empathetic. Others gave generic advice or missed serious mental health warning signs. Many responses sounded polished but lacked real understanding.
As Dr. Clark puts it, “These bots can say the right words, but they can’t feel anything.” For teens in emotional distress, that difference is everything.
Key Facts About AI Therapy Chatbots and Youth 🧠
• AI chatbots are not licensed therapists. Most don’t follow clinical standards.
• These platforms are often unregulated, and some have no mental health professionals on their teams.
• Teens may use AI mental health apps out of curiosity, boredom, or because they feel alone.
• Bots can miss red flags—like suicidal thoughts or mentions of self-harm—because they aren’t trained clinicians.
• The marketing of these apps can give teens the false impression that they’re receiving therapy, when in reality it’s automated support.
What Parents, Educators, and Caring Adults Can Do 💬
• Ask questions: Be curious about the apps and online platforms teens are using for support.
• Teach critical thinking: Help kids understand the difference between professional help and chatbot-generated advice.
• Promote real relationships: Encourage teens to talk with trusted adults—counselors, family members, teachers.
• Discuss emotional safety: Let teens know it’s okay to seek help, but not every “support” tool is truly safe or effective.
• Model healthy tech habits: Show the value of human connection in your own life—and invite teens to do the same.
Why This Matters
AI will play a role in the future of mental health, but it should never replace the value of human connection—especially for young people. As technology evolves, it’s up to us as adults to guide the next generation in using these tools responsibly and safely.
Make sure to click here to read the entire article about how these chatbots responded.
Thanks for being part of the Shape the Sky community. Keep asking questions. Keep connecting. And keep showing up.
✌️ Ryan