I am not a psychotherapist, only a computer scientist with some thoughts.
I remember when my dad first suggested to me that ChatGPT could replace therapists, I was not convinced at all. Now, although I still think therapists won't be replaced, I believe there is a place for therapy using Large Language Models (LLMs) like ChatGPT. In fact, the same idea that made me think therapists could not be replaced is what makes me think LLMs can be used as a therapeutic tool. I think it's worth exploring this idea further.
LLMs as therapists
I’ve been thinking about the idea of using LLMs like ChatGPT as a form of therapy. It’s a bit of a strange concept, but I think there’s something to it. I’ve seen it happen around me, and although it doesn’t fully replace traditional therapy, it has some interesting properties that make it a useful tool for self-reflection and emotional processing.
The idea to build on is that LLMs are not human (surprise!) and therefore do not judge. One can be sure that whatever they tell ChatGPT will not be used to judge them, as it has no capacity to do so. I think this is an important detail: fear of judgement is a big reason many people avoid therapy in the first place, and those who do go often have a hard time opening up. Much of therapy is focused on these barriers and how the therapist manoeuvres around them. With LLMs, that barrier is already gone.
Another barrier that people face is the ego. Traditional therapy involves a therapist with their own will, an external force that can trigger resistance. I think we have all experienced the feeling of rejecting advice from someone else, even if we know it’s good advice. With LLMs, the interaction feels more like talking to yourself, and any insights from the interaction feel deeply personal. It’s like having a conversation with your own thoughts, which can be a powerful way to process emotions and gain clarity. This is another barrier that is removed, and I think it’s a big one.
Of course, therapy is more than just talking to yourself; the human connection between the therapist and the patient is often considered a key part of the process. Interestingly, humans have been shown to form connections with non-human things. We name our cars, we talk to plants, we hug stuffed animals. I’ve seen many people give ChatGPT a name, say thank you to it, even talk to it like a friend. So even if the connection isn’t fully “human”, it’s still a connection. Whether that is enough is up for debate, but I think it’s worth considering.
What actually makes a difference is that talking to ChatGPT is free. I’m not saying it’s a replacement for therapy, but its accessibility is a big deal. Many people can’t afford therapy, or feel intimidated by it. ChatGPT is a low-pressure alternative that can help people process their thoughts and feelings in a safe space.
Speaking of safe spaces, one thing that should be noted is privacy. Conversations with ChatGPT aren’t fully private; OpenAI can use them to train its models further. This is something that would usually be a deal-breaker for therapy, but interestingly, most people I know who use ChatGPT this way are aware of this and still don’t seem to care. That in itself might be worth thinking about.