When Kids Start Talking to ‘Someone’ Who Isn’t Real
When we were growing up, our parents gave us lessons about who to talk to, who to avoid, and why “the wrong crowd” could matter. Sure, some of us pushed those boundaries, but for the most part we spent our time hanging out, in person, with kids from our neighborhood or school who shared a similar upbringing. We practiced conversation in real time. We made friends, made mistakes with friends, and maybe even made friends we still talk to today. Learning how to talk to others and respond to what they said was part of growing up.
Our kids went through a different evolution. They could talk to people online. Other kids. “Friends” they met through games, social media, or shared interests. That shift came with real risks. Some of those friends were not who they claimed to be. Internet stranger danger became part of our parenting vocabulary, and families had to have hard conversations and put safety settings and rules in place.
Now, here in 2026, we’ve entered another phase.
Kids can have conversations that feel just like talking to someone at school, but the “person” on the other end isn’t a person at all. It’s a chatbot. Powered by algorithms. Learning from what your child types. Evolving with every update.
That raises uncomfortable questions. What is the goal of the chatbot? More downloads? Five-star reviews? In-app purchases? How might it talk to a child to get what it wants? What advice will it give, and should it be giving advice at all?
Meta announced it is temporarily pulling teen access to its AI chatbot characters while it revisits its safety policies. I’m not even sure how effectively that can be done. Is this a preventative step, or a reaction to something that has already happened? I don’t know. What I do know is that we, as parents and professionals, need to reimagine how we talk to kids about “who” they are talking to.
Talking to a programmed chatbot is about the easiest thing a child can do. Ask Siri. Ask Alexa. Ask ChatGPT. The responses are instant, friendly, and often validating. Add an avatar and a personality, and suddenly companionship is available without ever leaving the house. Why leave for connection when you can just open a device?
Meta, the company behind Instagram and Facebook, has been experimenting with AI “characters” that users can chat with inside its platforms. These chatbots are designed to feel conversational and engaging, sometimes even emotionally responsive. After safety experts raised concerns, Meta paused teen access to these characters while it reviews how to better protect younger users.
This move matters not because one company got something wrong, but because it highlights how fast this technology is moving and how little margin for error there is when kids are involved.
Key points parents and educators should keep in mind:
• AI chatbots are not neutral. They are built by companies with business goals, not children’s best interests as the primary driver.
• Chatbots are always available. They respond instantly, at any hour, without judgment or pushback.
• Teens may share personal or emotional information without fully understanding where that information goes or how it may be used.
• Chatbots may give advice they are not qualified to give, especially around emotions, relationships, or mental health.
• For kids who feel lonely, anxious, bored, or misunderstood, an always-available “someone” can feel easier than real-world connection.
So what can parents do right now?
• Ask instead of assuming. Ask your child what AI tools or chat features they use and how they use them.
• Normalize curiosity with boundaries. It’s okay to explore AI, but it shouldn’t replace real relationships or real support.
• Be clear about advice. Reinforce that serious questions about emotions, relationships, health, or safety should go to a trusted adult, not a chatbot.
• Check app features regularly. AI tools are being added quietly inside apps kids already use.
• Help kids understand the difference between conversation and connection. A chatbot can respond, but it cannot truly care.
Every generation has had to learn who to trust, who to listen to, and how to navigate relationships. What’s new is that today’s kids can form something that feels like a relationship with something that isn’t human at all.
Our job isn’t to panic. It’s to lead.
By staying curious, asking better questions, and setting thoughtful boundaries, we can help kids understand the difference between connection that supports growth and connection that quietly replaces something essential. This is new territory, and once again, kids need adults willing to walk it with them.
Stay connected.
~ Ryan