Beyond the screen: 8 ways to monitor your child’s use of AI for mental health

Sep 19, 2025 | Blog

By Kristen Hayes, Licensed Program Expert, Youth Villages

September is Suicide Prevention Month, a time to raise awareness about this epidemic affecting our communities. According to a 2023 Centers for Disease Control and Prevention (CDC) report, suicide is the second-leading cause of death in the United States among teens and young adults ages 10-34. In 2023, across all age groups, there were 1.5 million suicide attempts and nearly 50,000 deaths by suicide. These rates can be linked to many factors, including a shortage of mental healthcare providers and the growing number of youth and young adults turning to alternative outlets when professional support feels out of reach.

In today’s digital world, one of the most common of these outlets is artificial intelligence chatbots (ChatGPT, Claude, Gemini, Copilot, Meta AI). Although these tools were originally designed for advanced search, writing assistance and productivity support, people increasingly use them for companionship, relationship advice and even mental health conversations.

AI chatbots can be supportive in moments of loneliness. They are available 24/7 and provide a sense of being “heard.” But they cannot think, feel or take responsibility for the advice they give. This makes them a risky substitute for authentic care.

This new reliance on chatbots has raised serious concerns. One tragic case involved a young man named Adam, who reportedly used ChatGPT over several months to discuss his suicidal thoughts. When he asked about suicide methods, the chatbot reportedly provided information about them. Adam later lost his life to suicide. His story highlights the potential dangers when AI becomes a substitute for professional help or human connection.

Suicidal ideation and attempts connected to chatbot use have gained national attention in recent months. Although safeguards exist to direct users to crisis hotlines and resources, these protections are most effective during short interactions, not during the long, emotionally intense conversations some individuals have with these chatbots. While many technology companies are working to improve their platforms to better respond to signs of suicidal thinking, gaps remain.

As technology advances, parents and caregivers have an essential role in teaching children how to use these tools safely and responsibly. Research shows that strengthening protective factors is one of the most effective ways to reduce mental health struggles and suicide risk.

Tips for parents to strengthen protective factors include:

  • Keep ongoing, two-way conversations about online safety and AI chatbot use.
  • Make time for daily check-ins and family activities such as meals or shared interests.
  • Encourage involvement in pro-social hobbies and community groups.
  • Support daily physical activity and consistent sleep routines.
  • Create phone-free times at home and in shared spaces.
  • Approach tough topics with openness and without judgment.
  • Use electronic parental controls when necessary.
  • Stay connected with healthcare providers, therapists or school counselors if concerns arise.

By combining awareness, parental involvement and strong community connections, we can work together to reduce suicide risk and help young people navigate both digital and real-world challenges.

If your child needs further help, you can call or text 988, the Suicide & Crisis Lifeline. In Tennessee, you also can contact the statewide crisis hotline at 1-855-274-7471 or text ‘TN’ to 741741.
