
It's Not A Bad Thing, Say Doctors As Patients Turn To AI-Led ChatGPT For Opinion
Artificial Intelligence tools like ChatGPT are transforming how we use healthcare. Used wisely, they can empower patients and support doctors
In the middle of the night, a notification buzzed on my phone: the final report of my blood tests had arrived. Anxious but unable to make sense of the complex medical jargon, I turned to ChatGPT for help. I uploaded the report and asked for an analysis.
To my surprise, ChatGPT broke down the details in very simple language. It not only explained the results but also suggested possible treatment options and listed some of the best doctors in India for the deficiencies it identified. That night, I found myself in an extended conversation with the AI, asking it a series of questions, almost as I would ask a doctor.
Of course, I knew I was speaking to an AI tool, not a qualified medical professional. Yet, when I later discussed the findings with my doctor, I was amazed to see that much of what ChatGPT had explained matched the medical advice I received.
My experience isn't unique. Just last month, I came across a news story about a woman in Paris, Marly Garnreiter, who credited ChatGPT with identifying the possibility of her having cancer—before any medical expert had confirmed it.
Curious to know if more people had similar experiences, I reached out to doctors—and the response was overwhelming.
Doctors Confirm The AI Trend
When News18 contacted Dr Satya Prakash Yadav, Director of Paediatric Haematology-Oncology and Bone Marrow Transplant at Medanta Hospital, he immediately acknowledged the trend. "Yes, it's happening," he said. He recalled a case of a child from a village in Uttar Pradesh suffering from bone-marrow failure.
"I asked the family how they found me, and the father replied: 'I asked ChatGPT, and it told me to meet you'."
"Not only did ChatGPT tell him about the diagnosis, but also the treatment, and where he could find it. AI is very powerful," Yadav recalled, adding, "AI uses information available on the internet to give you an answer. It would be highly accurate if lots of data were available about a topic, and very wrong if very little or none were available. It's 100,000 times faster than Google."
Dr Swapnil M Khadake, Head of Critical Care at Fortis Hiranandani Hospital, Vashi, noted that the growing popularity of AI health tools like ChatGPT is changing how patients engage with doctors. "People now come to us having already discussed symptoms and potential diagnoses with these apps," he said. "Some even arrive with specific questions or preconceived notions, all shaped by what the apps have told them."
While this evolution makes healthcare more accessible, it also presents new challenges.
Doctors are now navigating not just misinformation from internet searches, but also AI-generated content that may not always be accurate.
Balancing Technology & Professional Expertise
"This ChatGPT trend reflects the direction healthcare is heading," said Dr Khadake. "Medical professionals will increasingly encounter patients who consult AI tools before seeing a doctor. Our job is to correct misconceptions, ensure accurate understanding, and educate patients about the limitations of artificial intelligence in medical diagnosis."
Doctors now need to balance the benefits of AI against the value of their own medical expertise. As AI tools become more common, the challenge is to understand how they can help without replacing the role of real doctors.
"By collaborating with patients who use these tools, doctors can offer more personalised and informed treatment plans," Khadake added.
Other experts shared similar experiences.
Sample this instance shared by Dr Maninder Dhaliwal, an expert in paediatric pulmonology at NCR-based Amrita Hospital: "A few months ago, I diagnosed a six-year-old with asthma. The parents were anxious but receptive, and we started treatment."
At the review visit, he said, the parents came in with thoughtful questions. "What's FeNO? What's IgE? I was pleasantly surprised. They shared that they had chatted with AI, and now it all made sense to them: why I asked certain questions, why he was earlier given nebulisation, why we chose this treatment, why the inhaler is important, and so on."
Dhaliwal says this doesn't mean parents don't trust him; it only means they want to be better informed, and that is a good thing. "I have accepted AI as a reality and have moved forward, but with caution. I honestly don't have a problem if patients read up or chat with online tools about their illness. It shows they're curious and involved in their own care," Dhaliwal said.
Can ChatGPT Outperform Doctors?
A small study reported by The New York Times found that ChatGPT actually outperformed human doctors in some diagnostic tasks. The study showed that ChatGPT scored an average of 90 per cent in diagnosing medical conditions based on case reports, compared to 76 per cent for doctors who used the chatbot and 74 per cent for those who didn't.
But several other studies warn against using it blindly.
Dr Rajeev Jayadevan believes that while technology like ChatGPT is helpful for professionals, it should be used as a supportive tool, not a standalone diagnostic source. "AI can help broaden a clinician's thinking by suggesting possibilities they might not immediately consider, similar to ordering an additional lab test," he said.
However, he cautioned against patients relying solely on ChatGPT or Google for self-diagnosis. "At the end of the day, what a patient really needs is clarity, peace of mind, and reassurance, which only a trained human professional can truly provide."
In short, AI tools like ChatGPT are transforming how we use healthcare, but they are no replacement for trained medical professionals. Used wisely, they can empower patients and support doctors. One thing is clear: the future of medicine is not man versus machine, but the two working together.
Location: New Delhi, India
First Published: May 12, 2025, 12:40 IST
Former India batter Vinod Kambli faces serious health and financial challenges. Despite treatment, he struggles with speech and mobility. His brother urges public prayers. Former India batter Vinod Kambli has been dealing with serious health challenges in recent times. Last year, he was seen at a public event with legendary Sachin Tendulkar, where he appeared to have difficulty walking. Shortly after, former India captain Sunil Gavaskar extended his support, assuring Kambli of the required assistance. With that help, Kambli underwent rehabilitation and also received medical treatment. However, his younger brother Virendra has now made a concerning revelation about his condition. Speaking on The Vickey Lalwani Show, Vinod Kambli's younger brother shared that the former cricketer is yet to fully recover from his health issues. Currently residing at his Bandra home, Kambli is undergoing recovery but continues to face challenges, particularly with his speech. 'He is at home right now. He is getting stable, but his treatment is on. He is having difficulty speaking. It will take him time to recover. But he is a champion, and he will come back. He will start walking and running, hopefully. I have a lot of faith in him. I hope you can see him back on the ground," said Virendra. 'He underwent rehab for 10 days. He had a comprehensive body check-up, including brain scans and a urine test. The results were fine; there weren't too many issues, but since he couldn't walk, he was advised to undergo physiotherapy. He still slurs in his speech, but he is getting better. I want to tell people who pray for him, so that he gets better. He needs your love and support," he added. Alongside his health struggles, Vinod Kambli is also facing financial difficulties. Earlier in January, his wife, Andrea Hewitt, disclosed that she had filed for divorce in 2023 but later withdrew it after witnessing her husband's 'helpless state'. view comments First Published: Disclaimer: Comments reflect users' views, not News18's. Please keep discussions respectful and constructive. Abusive, defamatory, or illegal comments will be removed. News18 may disable any comment at its discretion. By posting, you agree to our Terms of Use and Privacy Policy. Loading comments...