Comfort or isolation: Pakistanis weigh pros and cons of ChatGPT as confidant
LAHORE: When Mehak Rashid looks back on a restless, emotionally fragile phase of her life earlier this year, an unlikely confidant comes to mind.
'When nobody else was listening to you and everybody else thought you were crazy, ChatGPT was there,' Rashid, a metallurgy and materials engineer from Lahore, told Arab News.
'I just wanted to be heard… It will not give you a judgment and that's so beautiful.'
Rashid began using the chatbot after noticing her children experimenting with it for schoolwork. Now, she often turns to it for 'answers' and 'different perspectives.'
'It helps me in every way,' she said.
Since its launch in November 2022, ChatGPT has attracted hundreds of millions of users and, by mid-2025, logged nearly 800 million weekly active users. Many in Pakistan, among the top 20 countries for ChatGPT traffic, use it daily for emotional support, venting feelings, or late-night reassurance when friends aren't available.
Globally, an estimated 40 percent of ChatGPT conversations relate to mental well-being, and a Sentio University survey found nearly half of users with ongoing mental health issues rely on it for support: 73 percent for anxiety, 63 percent for advice, and 60 percent for help with depression.
While this instant comfort helps some cope, psychologists warn that heavy reliance on AI can weaken real human connections and deepen social isolation in a country already short on mental health resources.
A March 2025 study by OpenAI and MIT found frequent users reported increased dependence and loneliness, suggesting that AI companionship can erode human bonds and intensify feelings of isolation rather than resolve them.
For Lahore-based designer Khizer Iftikhar, ChatGPT began as a professional aid but gradually crept into his personal life and started affecting his relationships, especially with his wife.
'I have a very avoidant attachment style,' he said. 'Instead of confronting someone, I can just talk about the good part with people and let the chatbots handle the negative part.'
Iftikhar described ChatGPT as 'a multiple personality tool' that lacked the balance of real human interaction.
Many experts say relying on AI models can weaken bonds over time, reduce empathy, and make people more emotionally self-contained, preferring the predictable reassurance of a machine to the give-and-take of genuine human connection.
'With humans, relationships are about give and take. With chatbots, it's not like that,' Iftikhar said.
Despite once trying therapy, he now uses ChatGPT to process emotions and trusts people only for practical advice.
'I would trust a chatbot more when it comes to the feelings part,' Iftikhar said. 'But when it comes to the work part, I will trust humans more.'
In Islamabad, 26-year-old Tehreem Ahmed initially used ChatGPT for office transcriptions and calorie tracking, but it eventually became an emotional lifeline.
One night, overwhelmed by troubling news and unable to reach friends, she turned to the chatbot.
'It was around 3am and none of my friends were awake,' she said. 'So, I went on ChatGPT and I typed in all that I got.'
The chatbot encouraged her to pause and reflect before reacting.
'I feel like it responded well because I gave it a smarter prompt… Had I just said, 'Hey, this has happened. What should I do?' I guess it would have just given me all the options… I could have self-sabotaged.'
While Ahmed doesn't fully trust the bot, she said she preferred it to people who might dismiss her feelings.
'If I know my friend is not going to validate me, I'd rather go to the bot first.'
'DETERIORATING HUMAN CONNECTIONS'
For one anonymous Lahore-based tech professional, ChatGPT quickly shifted from a practical helper to an emotional crutch during a difficult relationship and the ongoing war in Gaza.
She first used it in late 2023 to navigate a job change, edit CVs, and prepare for assessments. But emotional upheaval deepened her reliance on the bot.
'That [romantic] relationship didn't progress,' she said. 'And the platform helped me a lot emotionally in navigating it.'
Her sessions became so layered and spiritual that some ended in 'prostration from spiritual overwhelm.'
Still, she was careful not to project too much onto the tool:
'It's a mirror of my flawed self… I try not to let the tool simply reflect my ego.'
Psychologists caution that without the challenges and messiness of real interactions, people using chatbots may lose vital social skills and drift further into isolation.
Mahnoor Khan, who runs MSK Clinics in Islamabad, agreed, saying the search for emotional safety in AI was becoming increasingly common as people feared judgment from others.
'Over a period of time, human connections have deteriorated,' the psychologist said. 'When people share something vulnerable with a friend, they often feel judged or lectured.'
To avoid that, many turn to chatbots. But Khan warned that AI's constant affirmation could have unintended consequences.
'It will tell you what you want to listen to… If you're happy, it's your companion; if you're sad, it instantly talks to you. The downside is that you are getting away from socialization.'
The trend is especially troubling in a country where mental health care remains deeply under-resourced: Pakistan has fewer than 500 psychiatrists for a population of over 240 million, according to WHO estimates.
Little wonder, then, that even people with clinical mental health conditions are turning to AI.
Khan recalled the case of a young woman who used ChatGPT so often that it replaced nearly all her social interaction.
'She had a lot of suicidal ideations,' Khan said. 'She kept feeding ChatGPT: 'I feel very depressed today… you tell me what I should do?' ChatGPT kept telling her to avoid friends like that.'
Eventually, she cut everyone off.
One day, she asked the chatbot what would happen if she overdosed on phenyl.
'ChatGPT said, 'There are no consequences. In case you overdose yourself, you might get paralyzed,'' Khan recalled.
The girl only read the first half and attempted suicide.
She survived.