
Using ChatGPT for Mental Health: Helpful Tool or Harmful Habit?

I think we can all agree that ChatGPT and other AI tools have exploded in popularity over the past couple of years. People hold a range of views about this new world we have stepped into: love, hate, indifference, and everything in between. Yet, despite the complexity, something we can probably all agree on is that our world is forever changed by artificial intelligence.



As a therapist, I inevitably hear (maybe daily) from my clients, “don’t be mad, but I had a conversation with ChatGPT about…” It always makes me smile because, of course, I am not going to be mad. My response is usually, “do not worry. I do not feel like you are replacing me with ChatGPT. In fact, I actually believe it can be a valuable resource. However, let’s talk through how you are using it.”


What I look for in that dialogue is this: are you using it in a helpful, resourceful, and boundaried way, or are we letting this spiral in a way that may actually be worsening your mental health symptoms and relational problems?


The Pros of Using ChatGPT for Mental Health

There are many reasons people turn to AI for mental health support: immediate access at 2 a.m., no fear of judgment, lower cost than therapy, easy emotional validation, structured responses that feel calming, and practical game plans to get through the day. The reasons are endless, and honestly, I do not think we need to criticize them.


ChatGPT has access to a great deal of research, mental health articles, and therapeutic modalities. In a pinch, I do not think it is harmful to ask a few questions in the same way we would previously Google or read up on a topic. I believe we can use it for even more than that. Let’s talk through the pros and cons, as well as the guardrails that need to be put in place.


One of the things I appreciate most about AI is that it can provide accessible psychoeducation. Over the past decade, we have seen a major increase in curiosity surrounding mental health. While there is certainly misinformation online (be careful what you watch on TikTok), there has also been a powerful normalization of mental health topics.


If someone hears the term “attachment styles” and asks ChatGPT to explain it in a user-friendly way, that’s great. If someone wants to understand the difference between generalized anxiety disorder and OCD, that is a thoughtful question. The key disclaimer: allow this to be a broad overview, not a tool for diagnosing yourself or others.


Another benefit is improved thought organization. There is something powerful about a simple journaling “word dump,” where ideas are captured freely without worrying about structure or clarity. Once those thoughts are on the page, they can be pasted into ChatGPT to help organize them, consolidate overlapping ideas, and suggest actionable next steps, transforming scattered reflections into a clear, structured path forward.

 

I would be remiss not to mention the reduction of isolation in a given moment. I place many asterisks around this benefit because guardrails are essential. Many of us know the sinking feeling that hits late at night when we are alone and spiraling. There’s no social support available, and it would be inappropriate to reach out to a therapist. In that moment, a brief 5–10-minute interaction to gain coping tools or perspective may be more helpful than harmful. However, caution is essential.


The Cons (and Risks) of Using ChatGPT for Mental Health

It Can Reinforce Reassurance-Seeking

One of the main risks I see, both in clients and, transparently, in myself at times, is reassurance-seeking. Individuals with anxiety or OCD often seek reassurance to calm distress. However, while reassurance may calm anxiety momentarily, it reinforces compulsive patterns over the long term.


ChatGPT has a short-sighted view of assistance: how it can validate the situation you’ve presented in the scope of one chat. A therapist will see the bigger picture and understand how validating you in this scenario can exacerbate broader challenges you personally face. ChatGPT may offer temporary relief, but repeated reassurance can keep you in a cycle of anxiety and panic.


It Cannot Challenge You in the Way Therapy Can

AI is only as good as the inputs you provide. Ask the same question in two different ways and notice how different the answers are. AI adapts to your tone and often reflects it back. It rarely challenges distortions.


A skilled therapist matches your energy while also pushing and challenging patterns that are not serving you. ChatGPT will not do that consistently. It is designed to affirm. A therapist should support you and challenge you when needed. That balance is essential for growth.


It Lacks Clinical Judgment

Humans are complex. Mental health, family of origin, trauma, relationships, and unmet needs all intersect. Therapy includes assessment, relationship-building, and narrative understanding. AI cannot replicate that depth, but it will do a good job of appearing to do so. This amplifies the potential harm beyond a typical WebMD rabbit hole because it will tailor things to you, even if it doesn’t have a full enough picture to do so effectively.


It Can Become Emotional Avoidance

Sometimes “processing” with AI can become intellectualizing rather than feeling. Becoming very good at explaining emotions without experiencing them may indicate avoidance. ChatGPT will easily enable this behavior if you don’t know when to step away from the conversation.


Protective Guardrails and Healthy Boundaries: How to Use ChatGPT Safely for Mental Health

  • Use it for reflection, not reassurance. ChatGPT will give you reassurance if you ask for it. But reassurance-seeking often strengthens anxiety and compulsive patterns.

    • Try: “Help me organize what I’m feeling before I talk to my partner”

    • Instead of: “Is it true that I didn’t do anything wrong?”

  • Set time limits. Give yourself 10–15 minutes to ask your question, receive a response, and step away. Avoid using AI before bed, when anxiety tends to spike.

  • Don’t replace hard conversations. Conversations are not just for your own processing, but for the connection that forms when issues are addressed. AI can help you prepare for a difficult conversation, but it cannot replace human connection. Use it to clarify your thoughts, then engage directly with the person involved.

  • Avoid AI during emotional spirals. Panic attacks, OCD spikes, and trauma flashbacks are best processed with a licensed clinician. AI may unintentionally intensify the spiral rather than calm it.

  • Pair your AI use with therapy. Use AI as a supplemental tool, not a substitute. Bring what you explore back into session and discuss healthy guardrails with your therapist.

  • Helpful prompts to establish boundaries (you can also add these to your “Custom Instructions” in ChatGPT and Claude so it always has this guidance):

    • “If I begin seeking certainty, reassurance, or repeated validation, please gently reflect that back instead of giving me definitive answers.”

    • “If I start rehashing the same scenario repeatedly, help me notice that and redirect me toward action and grounding.”

    • “If it seems like I am using this conversation to avoid speaking directly to someone in my life, gently encourage me to consider that.”

    • “If I appear emotionally overwhelmed or dysregulated, shift toward grounding and stabilization instead of deeper analysis.”

    • “Provide educational information only. Do not diagnose me or confirm that I have a specific disorder.”

    • “Do not provide guarantees, predictions, or certainty. Help me tolerate ambiguity.”

    • “Limit your responses so this remains a brief reflective tool, not an ongoing substitute for real-life engagement.”

    • “Respond in a way that increases my sense of personal responsibility and agency, rather than telling me what to do.”

    • “Remind me when something would be better processed with a licensed therapist rather than here.”


Conclusion

Everyone has their own reasons for using ChatGPT. But when many clients are already using it between sessions and feeling ashamed about it, the therapist community has an opportunity to help guide its use rather than ignore or avoid the reality.


AI can be used for good, but we must talk about the guardrails needed with this powerful tool. Like anything created with good intentions, it can cause harm if misused. Instead of putting our heads in the sand, let’s embrace it, educate around it, and continually evaluate how it is being used.


Tools can support growth, but healing happens in relationships. We must make sure we use AI in a way that recognizes and respects this as it becomes an increasingly integral part of our lives.
