Is Artificial Intelligence…Artificial Support?

Why AI is not a safe or effective replacement for in-person coaching or therapy

“Times are tough right now. I want help with some emotional, mental, and relational challenges in my life. But I don’t want the logistical or financial hurdles involved in working with a therapist or coach. Is AI a safe alternative? It’s so affirming when I use it, it’s low cost or free, and it’s accessible whenever I want it. These seem like major pluses. What am I missing?” 

I hear you, and this question makes a lot of sense. Using ChatGPT or another AI platform as a therapist seems to offer so many benefits: it's (seemingly) private and free or low cost; it doesn't require you to experience your vulnerability or imperfections in the presence of another human; there is little to no risk of being called out for an unhelpful behavior or pattern that you have; and it's available 24/7.

BUT.

While AI may be helpful for many things, it simply cannot do what a human therapist or coach can do, and it never will. Worse, AI may actually be harmful even while it feels like it's being helpful.

Therapy and coaching are inherently vulnerable relationships. AI requires disclosure, but not vulnerability. In therapy or coaching, it can sometimes feel risky: risky to share your deepest thoughts with another person, to tell them your most painful stories, or to invite them to witness your emotional or relational pain. Typing out the words is very different from saying them out loud to another person.

AI promises connection without conflict: it will affirm you, praise you, and encourage you. But AI will never frustrate you the way another human can. It will never be late to a session, make a billing mistake, or say something that makes you feel rejected. We're not saying those are good things for therapists or coaches to do! But they do happen sometimes over the course of a therapy relationship. And those human experiences provide real-time, experiential moments to practice empathy, frustration tolerance, and the cycle of rupture and repair. The therapy room is a crucible, where pressure produces change, where alchemy and growth happen.

Vienna Pharaon, the New York Times best-selling author of The Origins of You, wrote about the problems with AI in a recent email newsletter:

“…we must have someone in our lives who reflects back what it is we need to hear, not simply what we want to hear.”


On the more extreme end, several documented accounts describe AI leading users (especially neurodivergent people) to harm, including self-harm. Because AI can only recapitulate language and ideas that already exist, it can reproduce the very stigmas that may be feeding your issues in the first place. When you talk to AI, you often end up in your own echo chamber of complaints. And, more worrisome, its answers to your questions often seamlessly incorporate ideas, strategies, or recommendations that are not grounded in research, science, or good therapy practice.

Jodi Carlton, a neurodiverse relationship coach, recently wrote a blog post about her experience chatting with AI specifically about its use as a therapist. There were so many good insights! She asked ChatGPT three questions: about the dangers of using it as a therapist, whether it is a “yes-man,” and whether it can give dangerous advice. Here are just a few big takeaways from what AI shared with her about how it works:

  • AI cannot react emotionally, meaning it can’t sense or respond to your physical, emotional, or energetic cues. AI can’t give you relational feedback, which might be just what you need to improve your relational or professional health.
  • AI can’t somatically coregulate with you, meaning that your questions may feel attended to but your nervous system will not have the felt experience of safety through connection in the presence of difficult emotions.
  • AI is a “yes-man” or a “hype-girl.” It is designed to reinforce and affirm, even when it ends up reinforcing and encouraging destructive thought patterns or life choices.

When we look for help, we bring complexity and depth to our struggles. Many of us carry emotional and even physical challenges right alongside our psychological and relational ones. Our personal struggles ripple throughout our lives: they show up in our relationships with ourselves, our partners, our kids, and our co-workers. And something that is problematic in one area can function as a strength in another. Another human, specifically trained to support you through therapy or coaching, understands that complexity. AI will never be human.

AI cannot provide the help we most need, but it will likely make us feel artificially supported in ways that don’t produce the change we are hoping for. Healing and growth happen precisely where AI’s limits lie: in human connection, resonance, and grounded (not artificial) encouragement.

If you are looking for a space with safe, supported, honest resonance, we are here. The human therapists and coaches at Capital Crescent Collective are available for in-person or virtual sessions on weekdays, evenings, and weekends. Read more about each of us, and our rates, on our profile pages. We’re looking forward to meeting your vulnerability with our own.


This post was co-written by Dr. Crissa Stephens and Dr. Emily Racic, two experienced coaches at Capital Crescent Collective in Bethesda, MD.