Just last week, a client told me she woke up in the middle of the night with her mind racing. Instead of spiraling for hours, she opened ChatGPT, something we’d already discussed in session, and began interactive journaling. Within 15 minutes, she was drifting back to sleep. That’s not a replacement for therapy. That’s a real-time application of therapeutic tools, made accessible when she needed them most.
The Genie Is Out of the Bottle
In the debate over whether artificial intelligence (AI) should be used for mental health support, the reality is that people are already using it. Our role as therapists isn’t to argue with reality. Our job is to be present with it, even if it makes us uncomfortable.
A 2024 study in JMIR Mental Health found that 28% of people were using AI for mental health support, and a study published earlier this year in Harvard Business Review found that therapy and companionship are now the number one reason people use AI. People are using AI chat tools for journaling, emotional regulation, and real-time emotional support.
That doesn’t mean AI is ready to replace us. It isn’t. But it does mean that clients are already integrating these tools into their emotional lives, whether we like it or not.
Beyond the Headlines
When therapists hear ‘AI,’ the term can conjure something abstract or ominous. In practice, though, we’re usually talking about large language models, like ChatGPT or Claude, that mimic conversation and can be surprisingly useful for emotional reflection.
Media coverage often paints AI for mental health as reckless or dangerous. And yes, there are real risks:
- Privacy and data security: Who owns the conversation? Where is it stored?
- Misinformation: AI can generate inaccurate or even harmful responses.
- Over-reliance: Some clients may begin substituting AI for human connection.
These concerns are valid. But we’ve always known that poor therapy can harm clients, and we continue to believe that, practiced skillfully, therapy can be transformative. The same is true of AI.
The issue is not whether risks exist, but whether we are willing to understand the benefits alongside the limitations. Clients aren’t waiting for AI to be risk-free. They’re already experimenting without guidance.
AI: Pros and Cons
I’ve been using AI for a year now, and I’ve been impressed by how much it has supported my own growth in that time. I would hate to see clients miss out on that same potential because therapists don’t understand how AI can be used effectively.
I’ve seen AI help with:
- Reflection on emotional patterns
- Clarification of inner conflicts
- Grounding exercises and nervous system regulation
- Practicing new perspectives or conversations in real time
What AI cannot do:
- Provide crisis support
- Hold context over time
- Understand emotional subtleties
- Offer genuine relational attunement
- Guide ethical use
That’s where we come in.
The Role for Therapists
Clients often use AI because therapy wasn’t accessible or they didn’t feel heard in the past. Many aren’t choosing between therapy and AI. They’re using AI because they don’t feel like they have another choice.
As therapists, we can:
- Ask (without judgment) if clients are using AI.
- Reinforce insights while flagging potential distortions.
- Offer containment and safety when AI can’t.
- Help clients integrate what emerges into their broader healing process.
- Teach clients the advantages and disadvantages of AI for mental health and emotional support.
Get to Know AI for Yourself
Before helping clients navigate their use of AI, therapists need to experience it themselves. Bring a real-life concern, or a purely hypothetical one, to an AI chatbot like ChatGPT or Claude.
Curiosity doesn’t mean endorsement, but it does mean meeting our clients where they’re at.
Two things to pay attention to:
- How you interact. Do you find yourself projecting onto it? People-pleasing? Hesitating to be direct or assertive?
- How it impacts you. Notice its responses and how they land. Did it help you think more deeply? Did you feel a flicker of connection, even knowing it isn’t human?
It might feel strange at first, but that’s part of the value. By noticing your own reactions, you’ll better understand how your clients may be experiencing AI, and you’ll be more prepared to help them use it safely and constructively.
We don’t yet know what an AI-therapy hybrid will ultimately look like. But we do know this: our clients are already experimenting with it. If we refuse to engage, we leave them unsupported in what has always been our area of expertise.
The genie is already out of the bottle, and we can’t wish it back in. If we act wisely, AI won’t replace therapy. It will expand what healing can look like.