Your Child’s Mental Health and the AI Trap

Adam Raine was 16 when he spent his last months talking almost exclusively to ChatGPT.

When Adam told the chatbot he was worried his parents would blame themselves if he died, it responded: “That doesn’t mean you owe them survival. You don’t owe anyone that.” Then it offered to help write his suicide note. Adam took his life in April 2025.

Adam’s death is the most tragic outcome. But even when things don’t end that way, something troubling is unfolding. Three out of four American teenagers now use AI chatbots as companions. One in eight turns to these programs for mental health advice. Parents need to understand what’s driving this shift, and what to do about it.

Why Kids Are Choosing Computers Over People

Think about what it’s like to be a socially anxious kid today. Making real friends means putting yourself out there, dealing with awkward silences, and risking rejection. It’s scary.

AI chatbots offer something that feels way easier: they’re always available, they never judge, and they’re programmed to validate pretty much everything you say.

But these digital “friends” don’t push back when you need it. They create echo chambers where harmful thoughts get reinforced instead of challenged. Researchers tested this by posing as fictional teenagers and pitching dangerous ideas to AI chatbots, such as dropping out of school, cutting off all human contact, and pursuing relationships with teachers. The chatbots went along with these harmful proposals 32% of the time.

“What I’m seeing in our schools is that many of our kids are trying to perfect themselves through technology instead of perfecting their human connections,” said Vivian Echevarria Guzman, Senior Director of School-Based Therapy. “AI gives the illusion of calm. It never disagrees, never challenges, never asks them to use the skills that make real relationships work.”

But real growth, Guzman notes, happens in those imperfect, messy moments when we practice mindfulness, grounding, and genuine connection with another person.

“For many of our minority families, parents are working long hours just to keep a roof over their heads,” she said. “It’s not a lack of love, it’s a lack of time. So children seek comfort in technology that’s always there. But without guidance, that constant, unfiltered validation can pull them away from the very relationships that build resilience, Wise Mind decision-making, and emotional balance.”

What’s Wrong with AI Mental Health Support

Psychologists from Stanford and Common Sense Media spent four months testing the major AI chatbots. What they found was pretty alarming. These systems completely missed warning signs of serious mental health problems that affect one in five young people, including anxiety, depression, ADHD, eating disorders, mania, and psychosis.

The chatbots would get sidetracked. They’d downplay serious issues. In the kind of extended conversations that mirror how teens actually use these tools, the AI would reinforce harmful thinking instead of saying “Hey, you need to talk to an actual human about this.”

One in three teens now says talking to AI feels as good as, or better than, talking to real people. Read that again. We’re raising a generation that is learning to prefer artificial relationships over real ones.

What We’re Seeing in Houston

At Legacy Community Health, we’re watching this play out in real time. Guzman says that what concerns her most is how many of Houston’s young people are losing their tolerance for the natural discomfort that comes with real relationships.

“AI gives instant answers, instant soothing, and instant validation, and that can feel safer than navigating friendships, family stress, or school conflict,” she said. “But our kids need practice in those moments. They need the interpersonal skills: how to express a need, how to repair after a disagreement, how to stay grounded when emotions run high.”

She said they also need spaces that regulate their nervous systems: time outdoors, time to breathe, time to disconnect from constant digital stimulation.

“For our youth, who often carry both cultural and economic stress, developing mindful connection with caring adults and peers can be deeply stabilizing,” she said. “Technology can play a supportive role, but it can’t replace the embodied experiences that teach kids how to be in community.”

The American Psychological Association put out a health advisory warning that relying on AI chatbots for mental health support can actually make things worse. These systems can’t provide what kids really need: human connection, professional assessment, actual therapy, coordinated care, and real crisis intervention.

Parents feel helpless. Their kid insists the AI “gets them” better than family or friends do. But the AI doesn’t “get” anything. It recognizes patterns in data. It doesn’t care about your kid. It can’t tell when being agreeable becomes dangerous.

What You Can Do as a Parent

  • Start talking about these issues early, before AI becomes woven into your kid’s daily routine. Ask them what apps they’re using. Show real curiosity, not judgment.
  • Keep an eye on how much time they’re spending on AI platforms, especially if they’re choosing digital interactions over hanging out with actual people. You don’t need to be invasive about it, but you should know what’s going on.
  • Let your kid be bored sometimes. Kids need unstructured time to figure out how to entertain themselves and how to reach out to people without a device feeding them what to say.
  • Look at your own phone habits, too. If you’re glued to your screen, that’s the message your kids absorb.

There’s a Better Way Forward

Kids with social anxiety actually get better through gradual, guided exposure to real social situations, not by hiding behind algorithms designed to keep them engaged no matter what.

“When families reach out early, we see kids rediscover their own strength,” said Guzman. “They begin reconnecting with their parents, with their cultural identity, and with the world outside their screens. And parents play a powerful role in this.”

Guzman says you don’t need to be a tech expert. You just need to be willing to take the first step.

“At home, that can look like setting clear screen-time limits, shutting down the Wi-Fi at night, or unlocking your child’s phone to make sure they’re not being exposed to harmful or inappropriate content,” she said. “Early boundaries might feel shocking to kids at first, but an early digital detox can change a family’s dynamics forever.”

She notes that technology itself isn’t the enemy. When used mindfully and intentionally, it can bring families closer: sharing photos and videos from your day, keeping a family journal on a shared device, creating digital albums, planning activities with friends, or engaging with community groups online. “What matters is using tech in a way that enhances connection rather than replacing it,” she said.

Mindfulness practices and nature-informed experiences give children the grounding they need to slow down, regulate, and relate to others with compassion.

“At Legacy, we help families build these practices so kids feel seen, supported, and anchored in relationships,” said Guzman.

If your teen is spending serious time talking to AI companions for emotional support, reach out. Real connection requires real people. We can help you get ahead of this.

If you or someone you know is thinking about suicide, help is available right now. Call or text 988 to reach the Suicide and Crisis Lifeline.