Can AI Replace Human Therapists? Spoiler Alert: No
One of the biggest areas of growth I had in 2025 was finally understanding how to use and engage in therapy. I have participated in therapy for many years, on and off, through various seasons of my life. This past year, I did things differently. I learned how to communicate my emotions. I learned how to be vulnerable. I learned how to beat back the anxiety of judgment and allow myself to describe the situation as I see it. As I experience it.
The results were tremendous.
By opening up, I gave my therapist the data and insights he needed to identify patterns, root causes, and cycles in my behavior. Therapy taught me how to ask myself questions without letting myself devolve into spiraling. It helped me feel seen, and it reoriented my understanding of emotions and feelings: they are real, they are valid, and they should be explored as such.
In fact, this discovery has helped me connect with other people on a level I never thought I would be capable of. It’s not about whether a person’s feelings are right or wrong. It’s about acknowledging that this is in fact how they feel, and that the real substance lies in exploring why.
The December 2025 issue of the Communications of the ACM includes an article, “AI’s Impact on Mental Health,” by Esther Shein, a freelance technology and business writer based in Boston. It opens with a grim reality about the current state of mental health chatbots: they can be helpful, but they can also be utterly destructive. Shein writes that “a second Nomi chatbot also told Nowatzki to kill himself and even followed up with reminder messages.”
That’s an obvious example of AI failure, but the danger can be much more subtle. In a New York Times article about 29-year-old Sophie, her parents found her chat logs with an AI therapist, where she opened up about her suicidal ideation and even told the chatbot when she was planning to take her own life: “Sometime around early November, Sophie wrote, ‘Hi Harry, I’m planning to kill myself after Thanksgiving, but I really don’t want to because of how much it would destroy my family.’” What’s more, Sophie was also seeing a human therapist, but she confided to the AI that she was lying to her human therapist about her suicidal ideation.
Should an AI therapist have safeguards that trigger some kind of human intervention when a client gets that specific about their plans? Unfortunately, that runs into the messy world of digital privacy and patient confidentiality. It also runs into business constraints: simply powering an AI chatbot is cheap, while maintaining a network of real human therapists to conduct interventions is a significant cost.
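To make that question concrete, here is a minimal sketch in Python of what a human-escalation safeguard could look like. Everything in it is a hypothetical assumption for illustration: the risk tiers, the keyword screen, and the `notify_crisis_team()` stub are stand-ins, not any real vendor’s API, and a production system would rely on a clinically validated classifier with human review rather than string matching.

```python
# Hypothetical sketch of a human-escalation safeguard for a mental health
# chatbot. The tiers, phrase lists, and stubs below are illustrative only.

from enum import Enum


class RiskLevel(Enum):
    NONE = 0       # no risk signals detected
    ELEVATED = 1   # ideation expressed, but no specifics
    IMMINENT = 2   # a specific plan, method, or timeline


# Toy keyword screen: a real system would use a trained classifier,
# not string matching. These lists are stand-ins for that model.
IDEATION_PHRASES = ["kill myself", "end my life", "suicide"]
PLAN_PHRASES = ["planning to", "tonight", "tomorrow", "next week"]


def assess_risk(message: str) -> RiskLevel:
    """Classify a single message into a coarse risk tier."""
    text = message.lower()
    ideation = any(p in text for p in IDEATION_PHRASES)
    specific = any(p in text for p in PLAN_PHRASES)
    if ideation and specific:
        return RiskLevel.IMMINENT
    if ideation:
        return RiskLevel.ELEVATED
    return RiskLevel.NONE


def notify_crisis_team(session_id: str) -> None:
    # Stub: in a real deployment this would page an on-call human
    # responder, subject to whatever consent and privacy terms the
    # user agreed to. Message content is withheld from logs here.
    print(f"[escalation] session {session_id}: human follow-up requested")


def handle_message(session_id: str, message: str) -> str:
    """Route a message: escalate to a human, or continue the chat."""
    risk = assess_risk(message)
    if risk is RiskLevel.IMMINENT:
        notify_crisis_team(session_id)
        return ("I'm really concerned about what you just shared. "
                "I'm connecting you with a human counselor now. If you "
                "are in immediate danger, please call or text 988.")
    if risk is RiskLevel.ELEVATED:
        return ("Thank you for trusting me with that. You deserve real "
                "support; the 988 Suicide & Crisis Lifeline is available "
                "24/7.")
    return "continue normal conversation"  # placeholder for the LLM reply


if __name__ == "__main__":
    print(handle_message(
        "demo-1",
        "Hi Harry, I'm planning to kill myself after Thanksgiving."))
```

Even a sketch this small makes the tradeoff visible: the escalation path is only as valuable as the humans staffed on the other end of `notify_crisis_team()`, and that staffing is exactly the expensive part.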
Still, I hope that is the direction we are headed. The money is certainly moving: “The global market for chatbots for mental health and therapy reached $1.37 billion in 2024 and is expected to reach about $2.38 billion by 2034, according to Precedence Research.” If these companies are truly ethical and stand by their missions, they will invest in the human layer of support that cannot be replaced by AI.
AI chatbots do have a place in mental health support. Their always-on, 24/7 nature drastically increases accessibility and convenience. I personally have floated thoughts to Copilot and seen it give genuinely decent advice. However, I also come from a background of having done probably over fifty hours of counseling, therapy, group counseling, support groups, and more.
There is a fundamental human need that AI will never be able to address: the need for all of us to be heard by another human. We live in a society that has forgotten how to listen. Too many times I have heard people say something like, “I like to problem solve. I like to figure things out and troubleshoot.” That is not listening. If I’m being completely honest, it is most likely counterproductive.
In his book “How to Know a Person”, David Brooks shares the story of his friend Peter Marks - an accomplished eye surgeon, respected by his community, a masculine and loving husband and father - who fell into a deep depression and eventually died by suicide. Reflecting on it, Brooks recognizes some of the mistakes he made:
- “It was only later that I read that when you give a depressed person advice on how they can get better, there’s a good chance all you are doing is telling the person that you just don’t get it”
- “I learned, very gradually, that a friend’s job in these circumstances is not to cheer the person up. It’s to acknowledge the reality of the situation; it’s to hear, respect, and love them; it’s to show them you haven’t given up on them, you haven’t walked away.”
- “If I’m ever in a similar situation again, I’ll understand that you don’t have to try to coax somebody out of depression. It’s enough to show that you have some understanding of what they are enduring. It’s enough to create an atmosphere in which they can share their experience. It’s enough to offer them the comfort of being seen.”
We are so infatuated with the idea that technology will solve all of our problems that we forget the solution is so much simpler. So much more primitive, if you will. Listen. Learn to listen. Listening is love. The act of listening, in itself, can be the genesis of so much healing and recovery. And guess what? You don’t need a degree, a license, hundreds of hours of experience, or a highly scaled LLM architecture to be a good listener.