
'ChatGPT Isn't Your Therapist': The Mental Health Cost of AI Therapy

Business Standard

18-07-2025


The Growing Role of AI in Mental Health

AI chatbots like ChatGPT are now everywhere: affordable, convenient, and often hailed as the next big leap in technology. Many industries have started replacing human professionals with chatbots, and mental health is no exception. Here's the problem, though: ChatGPT is not a therapist. And when we confuse its friendly tone with actual therapy, we risk doing more harm than good. We spoke with the team at BetterPlace Health, a leading mental health clinic in Delhi, to understand what real therapy looks like, why AI isn't a replacement for it, and who is really at risk.

Why ChatGPT Feels So Understanding

Dr Akul Gupta, a leading psychiatrist at BetterPlace Health, shares from personal experience that ChatGPT is very good at imitating humans. It mimics patterns of human speech and mirrors your emotions to give you a soothing reply, making you feel heard and understood, which can be comforting when you're alone and feeling vulnerable. But here's the catch: it doesn't actually know you. It doesn't think the way humans think, and it certainly doesn't feel what you feel. ChatGPT is a language model trained on vast amounts of data to predict the kind of reply you'd want to hear. The words may sound comforting on the surface, but there is no real understanding behind them. It sets false expectations, misleading you into feeling understood when it is only saying what seems right in the moment.

Risks of Replacing Real Therapy

Ms Ayushi Paul, clinical psychologist at BetterPlace Health, warns that relying on AI for mental health support can be misleading and, in some cases, harmful. You might follow its suggestions, believing you're making progress, but the AI doesn't actually know what progress should look like for you. It lacks the context, clinical judgement, and personal understanding needed to guide real progress. This can slow down your healing, leave important issues unaddressed, or even delay the diagnosis of serious mental health conditions.

Ethical and Safety Concerns

As Ms Lovleena, a clinical psychologist at BetterPlace, shares, AI tools aren't therapists. They haven't undergone clinical training, don't follow professional ethical guidelines, and can't deal with crisis situations. When someone is at risk of self-harm or experiencing a psychiatric emergency, a chatbot can't assess the severity, take action, or offer immediate care. At best, it may respond with generic advice, but it has no duty of care or accountability.

Then there's the question of data. Everything you share with an AI is stored somewhere. These tools are designed to learn from what users type, which means your personal thoughts, emotions, and mental health concerns may be used to train future models. This raises serious concerns about privacy and consent: Who owns that data? Where is it stored? What happens if it's leaked or misused? AI in mental health currently lacks clear regulation, safety standards, and oversight, posing real risks to vulnerable users without proper safeguards in place.

What You Can Do Instead

AI is here to stay; that much is clear. As the technology evolves, it can be an excellent tool when used alongside standard care. It can help you track your mood, send you reminders, or suggest coping strategies based on your situation. But it cannot replace the connection, insight, and responsibility that come with a real human being. A chatbot may sound understanding, but it doesn't know your story. It isn't trained to guide a comprehensive treatment plan, and it isn't equipped to handle the risks that come with mental health care. Comforting words are not the same as clinical support.

So what can you do instead?

  • Talk to a professional. Many therapists offer low-cost sessions, and some clinics use sliding-scale fees based on income.
  • Use AI as a tool, not a therapist. Let it help with reminders, journaling prompts, or creating schedules, but do not make it your main form of support.
  • Explore community-based options such as mental health NGOs, support groups, helplines, or sessions with trainee psychologists offering free support.

What Comes After This?

AI may be the greatest invention of the modern era, and it certainly isn't going anywhere. As we adapt to it and integrate it into our lives, we should use it to support treatment, not replace it. A few sessions with a psychiatrist or a psychologist will help you far more than any AI chatbot could. So yes, let's embrace innovation. But let's also protect what matters most: our emotional safety, our personal stories, and our right to be truly seen and supported. AI can be a helpful tool, but when it comes to your mental health, there is no substitute for a real human being.
