AI for mental health support: 2025 info sheet


Artificial intelligence (AI) is changing how young people across Canada learn, work, play, share and more. It can process information quickly, often in ways that feel personal and helpful. But there are some things to keep in mind when using AI for mental health support. This resource from Kids Help Phone (KHP) explores what AI is, how you might use it, safety considerations and ways you can access non-judgmental real human help for your feelings. 

What is AI?

In short, AI is technology that uses data to complete complex tasks efficiently. Some examples include ChatGPT, Siri, customer service chatbots, music recommendation systems, grammar checkers, spam filters and self-driving cars.

How do people use AI?

Some people use AI to: 

  • learn and research new things 
  • brainstorm ideas or organize information 
  • summarize content or take notes 
  • automate repetitive tasks 

Are young people using AI for mental health support?

Research by KHP in 2025 exploring how youth in Canada engage with mental health-related content online found that, among participating young people: 

  • 69% said mental health-related content on AI tools such as ChatGPT was helpful. 
  • About 17% had used ChatGPT for mental health advice. 
  • Many said they like these tools because responses are fast, feel personal, and feel safe and judgment-free. 

What are some things to consider if I use AI for mental health support?

AI can be helpful, but it’s not human. Like any digital space (e.g. social media or news channels), there are online safety tips to keep in mind. 

You might start by reflecting on the facts, for example:

  • AI doesn’t have thoughts / feelings of its own, including empathy (even if it sounds like it cares).  
  • AI is usually trained using publicly available information, so it doesn’t always have access to the latest research / data. 
  • AI’s responses aren’t reviewed by real people such as trained experts, doctors or helping professionals. 
  • AI doesn’t have human experiences, so it might not correctly interpret the language you use, your specific situation, your personal / family history, etc. 
  • AI’s responses can include bias, ads and incorrect / misleading information. 
  • AI isn’t regulated in Canada (there are no specific AI laws yet). 
  • Right now, there aren’t any clear standards to help AI tools put safety and well-being first. 

How can I take care of myself if I use AI for mental health support?

If you use AI for mental health support, you can prioritize your safety and well-being by asking yourself the following questions:  

  • What do I want to get from using AI? 
  • Is what I share private? (Tip: You can check the privacy policy of the tool you’re using.)  
  • Where does the information I get come from? 
  • How will I fact-check whether the information I get is right? 
  • Does the tool I’m using / the information I get from it match my values? 
  • How am I feeling about using AI? 
  • What will I do if I notice upsetting content? 
  • What are all my options for accessing support? (Tip: You can learn more about support options at KHP by scrolling down on this web page.) 
  • What boundaries do I want to set for myself and AI (e.g. how often I use it, what I use it for, etc.)? 
  • Does AI correctly interpret the words I use (and what I actually mean)? How often does this seem to happen — or not happen? 
  • How can I tell if content is AI-generated or real? For example, how can I find out if an image is a deepfake or an actual photo? 

When might I connect with a real human for support?

Help can come in many forms. AI can give you quick information, and a real person can help you navigate your thoughts, feelings and experiences. Ultimately, it’s about accessing support in whatever ways make sense for you. 

You might choose to contact a real human you can trust if you’re seeking: 

  • urgent help in unsafe situations 
  • ways to cope with thoughts of suicide 
  • support that recognizes your unique experiences 
  • help with a safety plan 
  • connection, culture and / or community 
  • ongoing relationships and / or in-person support 
  • space to discuss / explore your thoughts, feelings and actions 

On average, 75% of young people contacting KHP share something they’ve never told anyone before.

How can I contact a real person for help?

KHP offers help in all sizes by pairing technology (including AI) with real human support to give young people’s feelings a place to go. Youth across Canada can get free, confidential, multilingual help any time, day or night, at KHP. 

You can explore the programs, services and resources available at KHP to find the kind of support that fits your needs. 

You might also contact an adult you trust, such as a parent / caregiver, teacher, Elder, health-care professional or social worker, or a trusted friend or classmate. 

Whether through real human connection and / or AI for mental health support, you can get help for any feeling, big or small, in the ways that feel best for you. 
