'ChatGPT Isn't Your Therapist': The Mental Health Cost of AI Therapy


The Growing Role of AI in Mental Health
AI chatbots like ChatGPT are now everywhere: affordable, convenient, and often hailed as the next big leap in technology. Many industries have started replacing human professionals with chatbots, and mental health is no exception. Here's the problem, though: ChatGPT is not a therapist. And when we confuse its friendly tone with actual therapy, we risk doing more harm than good.
We spoke with the team at BetterPlace Health, a leading mental health clinic in Delhi, to understand what real therapy looks like, why AI can't replace it, and who's most at risk.
Why ChatGPT Feels So Understanding
Dr Akul Gupta, a leading psychiatrist at BetterPlace Health, shares from personal experience that ChatGPT is very good at imitating humans. It mimics patterns of human speech and mirrors your emotions to produce a soothing reply, making you feel heard and understood, which can be comforting when you're alone and feeling vulnerable.
But here's the catch: it doesn't actually know you. It doesn't think the way humans think, and it certainly doesn't feel what you feel. ChatGPT is a language model trained on vast amounts of data to predict the kind of reply you'd want to hear. The words may sound comforting on the surface, but there's no real understanding behind them. It sets the wrong expectations, misleading you into feeling understood when it is only saying what seems right in the moment.
Risks of Replacing Real Therapy
Ms Ayushi Paul, a clinical psychologist at BetterPlace Health, warns that relying on AI for mental health support can be misleading and, in some cases, harmful. You might follow its suggestions, believing you're making progress, but the AI doesn't actually know what progress should look like for you. It lacks the context, clinical judgement, and personal understanding needed to guide real progress. This can slow down your healing, leave important issues unaddressed, or even delay the diagnosis of serious mental health conditions.
Ethical and Safety Concerns
As Ms Lovleena, a clinical psychologist at BetterPlace, shares, AI tools aren't therapists. They haven't undergone clinical training, don't follow proper ethical guidelines, and can't deal with crisis situations. When someone is at risk of self-harm or experiencing a psychiatric emergency, a chatbot can't assess the severity, take action, or offer immediate care. At best, it may respond with generic advice, but it has no duty of care or accountability.
Then there's the question of data. Everything you share with an AI is stored somewhere. These tools are designed to learn from what users type, which means your personal thoughts, emotions, and mental health concerns may be used to train future models. This raises serious concerns about privacy and consent: Who owns that data? Where is it stored? What happens if it's leaked or misused? AI in mental health currently lacks clear regulation, safety standards, and oversight, posing real risks to vulnerable users without proper safeguards in place.
What You Can Do Instead
AI is here to stay; that much is clear. As technology evolves, so will AI, and it can be an excellent tool when used alongside standard therapeutic techniques. It can help you track your mood, give you reminders, or suggest coping strategies based on your situation. But it cannot replace the connection, insight, and responsibility that come with a real human being.
A chatbot may sound understanding, but it doesn't know your story. It's not trained to guide a comprehensive treatment plan, and it isn't equipped to handle risks that come with mental health care. Comforting words are not the same as clinical support.
So what can we do instead?
Talk to a professional. Many therapists offer low-cost sessions, and some clinics use sliding-scale fees based on income.
Use AI as a tool, not a therapist. Let it help with reminders, journaling prompts, or creating schedules — but do not use it as your main form of support.
Explore community-based options like mental health NGOs, support groups, helplines, or free therapy sessions offered by trainee psychologists.
What Comes After This?
AI may be the greatest invention of the modern era, and it certainly isn't going anywhere. As we learn to adapt to it and integrate it into our lives, it is important to remember that it should support our treatment, not replace it.
A few sessions with a psychiatrist or a psychologist will help you much more than any AI chatbot could. So yes, let's embrace innovation. But let's also protect what matters most: our emotional safety, our personal stories, and our right to be truly seen and supported. AI can be a helpful tool, but when it comes to your mental health, there's no substitute for a real human being.

Related Articles

We Asked ChatGPT For A Weight Loss Plan. An Expert Fact-Checks It
NDTV | 2 hours ago

If you're on a weight loss journey, you already know it takes patience, determination, consistency and, of course, money. Losing weight can often dent your pocket: organic groceries, gym memberships, personal trainer fees, Pilates classes, healthy snacks and meals all come with a price tag. But can AI really help make weight loss easier, and perhaps cheaper? We decided to find out. And naturally, we turned to ChatGPT.

Viral On Social Media
Right now, social media is overflowing with tips on how to use ChatGPT for weight loss. From acting as your personal trainer to helping you pick the healthiest dish from your pantry, the claims are endless. But how much of this actually works? We put it to the test.

ChatGPT As A Personal Trainer
We asked ChatGPT to act as our personal trainer. It worked surprisingly well. ChatGPT immediately slipped into "coach mode" and asked six specific questions about my fitness level, goals, available equipment and schedule. Once I filled in the details, it generated a detailed weekly plan tailored to my needs. The structure was clear, easy to follow and, best of all, completely free.

ChatGPT As A Nutritionist
If you're aiming to lose weight, eating right, ideally in a calorie deficit, is key. But hiring a professional nutritionist can be expensive. So, can ChatGPT create realistic, budget-friendly meal plans? We asked. We realised that the quality of the plan improves if you add a few key details to your prompt:
How much time you can spend cooking each day
Your location (state or country), so it can suggest region-specific foods
The number of days you want the meal plan to cover
Your diet preferences (vegetarian, vegan, etc) and any allergies
Pro tip: Ask for your weekly plan in a printable format so you can keep it handy in the kitchen and actually follow it.

What Does The Expert Think?
On paper, the workouts and the food might seem perfect, but Debjani Gupta, a nutritionist and wellness expert from Mumbai, says the plan is far from it. "We should also ask for blood work to be done, which will enable us to gauge whether there is any nutrient deficiency or not. Most times, people get stuck in their progress as the body doesn't function optimally with nutrient deficiencies," Gupta says about the diet plan. She also says that the meals look fine but may not supply enough protein or the needed quantities of vegetables. "Our body, especially for women, needs a portion of healthy fats that is missing - I like to recommend nuts and seeds for the same," Gupta says. "We may not be able to generalise this diet pattern, though, as all individuals have some specific needs other than the generic ones. I'm not sure if AI can work on those specifics. For example, this morning I had to ask a girl to support her progesterone with nutrients like zinc, B6, magnesium, Omega-3 and healthy fats for the next 5-7 days as she will take a hormone pill to induce her period," she says. Debjani, however, says that she "quite likes" the workouts suggested: "This is fantastic cardio."

More Prompts That You Can Try
Here are some additional ChatGPT prompts that can help make your weight loss journey easier:
"Suggest a 7-day vegetarian meal plan under 1,500 calories a day using only ingredients common in Indian households."
"Give me 10 healthy Indian breakfast ideas under 300 calories each."
"Create a 4-day home workout routine for fat loss, using only bodyweight exercises."
"Recommend high-protein vegetarian snacks that cost less than Rs 50 per serving."
"Suggest low-calorie dinner recipes I can cook in under 15 minutes."
"List the healthiest ready-to-eat options available in Indian supermarkets."
"Help me swap my favourite high-calorie dishes for healthier alternatives."
"Create a progressive workout plan for a beginner, increasing difficulty every week for two months."
"Plan a 30-minute daily walking routine with variations to keep it interesting."
"Make a printable shopping list for a week's worth of healthy meals under Rs 2,000."

Remember
In the end, ChatGPT isn't a magic wand for weight loss, but it can be a surprisingly handy (and free) assistant. From planning your workouts to designing realistic meal plans, it can take the guesswork out of the process and keep you organised. But if you have specific needs, AI might not be the answer. AI can't replace medical or nutritional advice, nor the discipline, motivation and actual effort you need to put in, but if you know how to ask the right questions, it can make the journey a lot easier on both your mind and your wallet. Caution: take ChatGPT's advice with a pinch of salt (and have it checked by a nutritionist if you need the diet tailored to your specific requirements).

Couple follows ChatGPT for travel tips, what happened at the airport was a shock for them and a big lesson for others
Economic Times | 4 hours ago

Synopsis
People are using AI chatbots for guidance, but experts warn against over-reliance. A Spanish couple missed their flight to Puerto Rico after ChatGPT gave them wrong visa advice. In another case, a man was hospitalised after following ChatGPT's dietary advice and replacing table salt with a toxic substance. These incidents highlight the risks of trusting AI blindly.

As artificial intelligence (AI) continues to advance rapidly, more people worldwide are turning to chatbots for guidance. However, experts caution against over-relying on AI tools for everyday decisions and problem-solving, a warning highlighted by a Spanish influencer couple's recent mishap. The pair ended up missing their flight after following travel advice from ChatGPT.

In a viral video, Mery Caldass is seen crying while her boyfriend, Alejandro Cid, tries to comfort her as they walk through the airport. "Look, I always do a lot of research, but I asked ChatGPT and they said no," Caldass explained when asked if they needed a visa to visit Puerto Rico for a Bad Bunny concert. She said the chatbot assured them no visa was necessary but failed to mention that they required an ESTA (Electronic System for Travel Authorisation). Once at the airport, airline staff informed them they could not board without it. "I don't trust that one anymore because sometimes I insult him [ChatGPT]. I call him a bastard, you're useless, but inform me well. That's his revenge," she added, suggesting the chatbot held a grudge.

This is not the first time AI chatbot advice has gone wrong. According to a case study in the American College of Physicians Journals, a 60-year-old man was hospitalised after seeking dietary advice from ChatGPT on how to eliminate salt (sodium chloride) from his meals due to health concerns. Following the chatbot's suggestion, the man replaced table salt with sodium bromide, a substance once used in medicines during the early 1900s but now known to be toxic in large doses. Doctors reported he developed bromism as a result. "He had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning," the report stated.

Good, bad and sick
Time of India | 11 hours ago

As AI companies delete the health disclaimers, the chatbots are running amok playing doctor.

Doctors may be wishing they could return to the good old days of Dr Google, because Dr AI is a much more formidable competitor and foe. What's dangerous is how a chatbot super-personalises vast amounts of health data and then, via conversational intimacy, seduces the patient into taking its advice when what's needed is seeing a real medical professional. In a case reported in Annals of Internal Medicine this month, a man developed bromide toxicity, a rare condition, after consulting ChatGPT about reducing salt in his diet. It advised him to swap sodium chloride for sodium bromide. The thing is, if he hadn't landed in the hospital, this wouldn't have gone on record.

Last month, another US study found that while more than 26% of chatbot answers to health queries carried some kind of warning that the LLM is not a doctor in 2022, less than 1% did in 2025. The finding is especially troubling because the Stanford and other US researchers used the likes of mammograms and chest X-rays to screen for disclaimer phrases; it indicates how much AI companies are encouraging people to use chatbots for health advice. At the GPT-5 launch event, for example, a person spoke about uploading her biopsy results to it and taking its help to decide whether to pursue radiation.

Actual medical advice is not plain pattern matching; it engages with the complexity of individual medical histories and circumstances. AI can be useful in the health sector in many ways, but Indians already pop enough pills prescribed only by their neighbourhood chemist. Substituting professional medical judgement with chatbot advice would only worsen their health. GenAI models, even as they become more capable and authoritative, must use proper disclaimers and safeguard against the dangers of providing a diagnosis in place of a doctor.

This piece appeared as an editorial opinion in the print edition of The Times of India.
