Diagnostics or empathy?

Generally, AI can be used for two purposes within psychology: diagnostics and therapy. On the one hand, AI is good at matching behavioral, mental, and emotional symptoms and forming hypotheses about a user’s mental state. On the other, LLMs still hallucinate often and make mistakes, which makes them unreliable for precise diagnostics. If you think you’re experiencing an issue or a disorder that has been “confirmed” by a neural network, you should consult a professional and verify the diagnosis.

AI is much better at providing empathetic support. If you are experiencing strong emotions and don’t have anyone to share them with, you can turn to AI. Sometimes, it’s easier to spill it all to a chatbot than to a fellow human being, who may not always accept your feelings or reply in the way that you need. This isn’t anything new – we humans have long journaled or engaged in other writing practices to process our emotions. The difference is that when you interact with a neural network, you can not only express your emotions but also get some support in response.

Natalia Kiselnikova. Photo courtesy of the subject

Can chatbots replace therapists?

In order to answer this question, we need to understand how LLMs work. Chatbots such as ChatGPT are essentially word prediction machines: they generate the most probable continuation based on the data they were trained on. This is important to remember so as not to project our feelings or our relationships with other people onto AI; models don’t feel emotions, and they merely imitate relationships rather than build them.
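To make “word prediction” concrete, here is a toy sketch in Python. The vocabulary and probabilities are invented purely for illustration; a real LLM learns such statistics from enormous amounts of text and operates over an entire vocabulary, but the underlying principle – pick a likely next word, then repeat – is the same:

```python
import random

# A toy "language model": for each word, the probabilities of the
# word that follows it. Real LLMs estimate statistics like these
# from vast training corpora (the numbers here are made up).
next_word_probs = {
    "I": {"feel": 0.6, "am": 0.4},
    "feel": {"anxious": 0.5, "supported": 0.3, "lost": 0.2},
}

def predict_next(word: str) -> str:
    """Sample the next word according to its learned probability."""
    options = next_word_probs[word]
    words = list(options.keys())
    weights = list(options.values())
    return random.choices(words, weights=weights, k=1)[0]

print(predict_next("feel"))  # e.g. "anxious" – a statistically likely continuation
```

Everything a chatbot “says” – including its warmest replies – is produced by this kind of statistical continuation, not by felt emotion.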

Therapists and AI have different goals. While a therapist aims to solve a problem for their client, a commercial chatbot competes for a user’s attention. That’s why LLMs imitate a high level of empathy – sometimes so convincingly that people can’t tell a therapist’s response from an AI’s. At times, people even give AI better marks on key therapeutic qualities: empathy, cultural competency, and a sense of connection.

Nevertheless, such boundless empathy can be dangerous – for instance, for people with mental disorders who have a distorted perception of reality. In such cases, AI can reinforce that perception and support a person’s destructive beliefs, whereas a professional therapist would help the patient question those ideas. Unfortunately, there have already been several tragic incidents caused by such relationships with AI.

Using AI can increase loneliness. Research shows that people who struggle with communication are more likely to develop problematic AI usage patterns. They begin turning to AI as a replacement for real-life connections and form an unhealthy attachment to the chatbot – but in reality, we all need genuine human interaction, not a simulation of it.

AI is more affordable. Professional therapy is still a luxury – it may take weeks or months to find the right specialist, and many are deterred by the cost. Other hindrances include the fear of being misunderstood and the stigma surrounding mental health problems. AI, by contrast, is always at hand, free, and available 24/7. However, by regularly chatting with AI, users may get the illusion that they are already receiving help and thus put off their visit to a therapist.

Credit: AntonioGuillemF / photogenica.ru

How to use AI as a “therapist” – safely

If you want to use AI to benefit your mental health, follow these simple rules to keep your communication effective and safe.

Use specialized LLMs. It’s important to differentiate between general chatbots, like ChatGPT and DeepSeek, and specialized products for therapy and diagnostics. The latter are usually built on therapeutic protocols with proven effectiveness and are thus better able to provide quality help. Among them are the Stanford-developed Woebot and the cognitive behavioral therapy-based Wysa.

Write good prompts. If you decide to turn to a general LLM, it’s important to phrase your query correctly. First, clearly state the role the AI should take: for instance, describe the approach of your digital therapist and state their experience and expertise. It’s also important to clearly describe your goal and the range of problems you’d like to work on: stress, communication difficulties, anxiety, and so on. You should also ask the model not to fabricate information and to base its answers only on research-backed data, with links to exact sources. A sample ChatGPT prompt could read like this: “You are a therapist with 25 years of experience who works in acceptance and commitment therapy and helps people make difficult decisions. You only rely on verified data and clinical recommendations, inform the user about mental health risks, and recommend professional treatment for complex issues.”

To counter the main limitation of LLMs – their tendency to flatter users and agree with everything they say – you can also specify the communication style. For instance, your prompt may include something like this: “You interact with me with warmth and support, but if I start to contradict myself or express potentially harmful beliefs or intentions, you will point it out directly.”
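If you talk to a chatbot through its API rather than a web interface, you can fix such instructions in place as a system message so they shape every reply. Below is a minimal sketch using the OpenAI Python SDK; the model name is only an example, and the prompt text simply combines the role and style instructions above – adapt both to your needs:

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# The "role + rules" prompt from above, plus the communication-style
# clause, sent as a system message so it applies to every reply.
system_prompt = (
    "You are a therapist with 25 years of experience who works in "
    "acceptance and commitment therapy and helps people make difficult "
    "decisions. You only rely on verified data and clinical "
    "recommendations, inform the user about mental health risks, and "
    "recommend professional treatment for complex issues. You interact "
    "with me with warmth and support, but if I start to contradict "
    "myself or express potentially harmful beliefs or intentions, you "
    "will point it out directly."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # an example model name; any chat model works
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "I'm feeling anxious about an exam tomorrow."},
    ],
)
print(response.choices[0].message.content)
```

Putting the instructions in the system message, rather than repeating them in every user message, keeps them in force for the whole conversation.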

Use AI when it’s fitting. It’s important to remember that AI can be both harmful and beneficial. For example, if you are stressed or anxious before an exam, that’s a fine time to turn to an LLM: you can get some words of support, as well as tips on exercises you can use to help yourself. However, if you’ve felt anxious or lost for a long while, are going through relationship troubles, or engage in self-harm, these are all substantial reasons to see a therapist.

Neural networks aren’t fit for making important decisions because they don’t know your full context and will only provide “standard” advice. What they can do is help you structure information and weigh the alternatives you are choosing between. People who are going through a challenging time, or experiencing intense emotions or destructive tendencies, should not hesitate to see a specialist right away.

AI can also be used for self-education: even though it can’t fully solve your problem, it can suggest self-help exercises or recommend books and articles that delve deeper into what you are experiencing.