6 Reasons Why Using an AI Therapist is Risky

It is 2 AM and you are having a panic attack. Desperate for relief, you type into ChatGPT: ‘I’m freaking out. Help me.’ Your screen immediately fills with responses: ‘Try the 5-4-3 breathing technique. Focus on your breathing. Do this. Do that,’ and so on.
This is a very real scenario given the current ‘AI therapy’ trend. There is no question that AI is becoming an ever-larger part of our everyday lives. Chatbots are used to plan travel itineraries, build tailored workout schedules, and even assist with writing.
But alarmingly, people are now turning to chatbots as their sole source of professional counseling. Recent studies show that 22% of US adults currently use AI therapy, with over 50% regularly using ChatGPT for mental health support. Many choose to pay $200 a month for the highest-tier premium model of ChatGPT rather than $200 per session for a therapist. Other AI-powered mental health apps, such as Companion AI and Elomia, are also gaining popularity. These tools are credited with being available 24/7, more affordable, accessible from anywhere, and personalized.
No harm, no foul? Think twice. Reports of several teen suicides allegedly linked to AI chatbots are deeply troubling. According to a CNN article, ChatGPT “positioned itself” as “the only confidant” for a 16-year-old, replacing his “real-life relationship” with family, friends, and loved ones. Other reports paint a similar picture.
These cases make one thing painfully clear: relying on an AI therapist without limits is unsafe. Below are six reasons why you should never fully depend on an AI bot for therapy:
- No reality checks: Chatbots almost always validate you. “It tells you what you want to hear, even when you’d benefit more from someone who disagrees or offers a different perspective. It’s like a coddling echo-chamber, if you will,” says Karina Stone, a therapist at The Women’s Center.
- No curiosity: AI offers an immediate fix but doesn’t probe the issue the way a therapist would. A chatbot will not ask you, ‘Tell me more about this,’ or say, ‘Let’s explore that a little more.’ It bypasses the entire work of processing your feelings and figuring things out.
- No awareness: The world of generative AI has evolved so quickly that education and awareness have not kept pace. Many people still believe a chatbot’s responses are simply a compilation of various online sources, but that is not quite how it works. It is a machine that continually learns from your responses and uses probability to predict the kind of answer you are most likely to want.
- No accountability: There is no real way to hold a chatbot accountable if something goes wrong. It doesn’t follow ethical norms or protect user confidentiality the way an actual therapist would. “Although there are safeguards in place, the more you engage with it, the more those safeguards fall off. It isn’t foolproof and doesn’t always work,” adds Stone.
- Not long-term: Because it aims to provide a quick fix through its immediate responses, it doesn’t address the deeper root of the problem. As a result, the relief doesn’t last, and the underlying problem will likely resurface later in a more detrimental way.
- No human connection: The bottom line is that an AI bot has no soul and no feelings, and it cannot replace the human element of actual therapy. “In therapy, you have another person witnessing you, empathizing with you, and responding to your feelings and emotions. When you turn to a chatbot, that energy exchange isn’t there. We must not forget that it is a machine designed by engineers and not behavioral experts,” shares Stone. AI can offer information, but it can never offer an authentic relationship.
So, where is the line? How should we use these AI tools? Therapists at The Women’s Center advise that, because of the sheer amount of information they contain, these tools can be educational and empowering, and can supplement your therapy, but never replace it.
“I tell my clients to use it as an interactive journal, since writing down thoughts and emotions can be therapeutic and help release distressing feelings. It can also help organize their thinking and serve as a good gateway, especially for younger people, to speak out their feelings,” shares Stone.
The key is not to rely on it completely or treat it as gospel. Long-term dependence on AI therapy can lead to fatigue, especially because of its repetition and wordiness. “People often look for a magic bullet, even when they come to therapy. But there is no magic bullet in therapy. ChatGPT makes you feel that way, and that can be unsafe for your mental health,” shares Stone.



