
Hear Me Out: ChatGPT is not my friend, but it lends a late-night listening ear

Straits Times

10-05-2025


There is a growing trend of Gen Z clients turning to AI tools, such as ChatGPT, as the 'first line of support for their emotional struggles'. PHOTO: REUTERS

SINGAPORE – It was 2am and I was in bed in tears after what, in hindsight, was a petty fight with my mother. She had reprimanded me for coming home late, and I snapped back, insisting that at 23 years old, I should not have to adhere to curfews.

My friends would have recognised this as an all-too-familiar rant topic. But that night, I didn't reach out to my closest confidantes. I was afraid of burdening them with the same story again.

Instead, I did something unexpected: I reinstalled ChatGPT. I poured out a long, meandering rant into the chat box and hit send.

To my surprise, the artificial intelligence (AI) app didn't just offer generic platitudes. It validated my feelings, pointed out patterns I hadn't noticed and gently nudged me towards reflection. It felt like someone was listening, even if that someone wasn't real.

Later, in conversation with a real friend, I sheepishly confessed what I had done. She admitted she had used ChatGPT for emotional support before. So had another friend. And another. Apparently, I wasn't alone.

Three years ago, I took a class on AI law. ChatGPT was in its nascent stages, and most discussions about AI felt abstract and far away. We discussed driverless cars, deepfakes and how AI had evolved over the years.

Back then, the movie Her (2013), American film-maker Spike Jonze's dystopian love story between man and machine, still felt like a metaphor. Now, I'm not so sure.

While people may not yet be falling in love with their chatbots, they are turning to them for something deeply intimate: comfort.

Even in the recent general election, candidates of some political parties instructed supporters to pull up ChatGPT during rallies to compare manifestos in real time. It was a quick-fire way to seek validation, affirm their arguments and appeal to voters' emotions on the spot.

Let me be clear. I do not endorse the unchecked and frequent use of AI, especially given its environmental toll and ethical concerns. But we can't ignore the growing reality that, for a generation raised on digital immediacy, AI is fast becoming a tool for productivity. Does this apply to the way we process our emotions too?

So, I began to wonder: What does that say about this generation growing up with AI at our fingertips? Do we crave instant validation? Are we avoiding difficult conversations? What does it mean when we talk to a chatbot as if it were a friend, and what does it mean when it talks back like one?

To understand what this shift says about us, and what it might mean for the future of emotional care, I spoke to a few mental health professionals.

Senior clinical psychologist Muhammad Haikal Jamil, founder of ImPossible Psychological Services, has observed a growing trend of Gen Z clients turning to AI tools as the 'first line of support for their emotional struggles'.

The Lighthouse Counselling's principal therapist Belinda Lau adds that AI probably breaks down some barriers, so that people don't feel as awkward or embarrassed about sharing more deeply how they genuinely feel.

That rings especially true for me. There's a strange relief in being able to send a raw, misspelt rant to a chatbot. One that is unfiltered, unstructured, typed in the heat of emotion. I don't worry about sounding articulate. I don't feel the need to soften my words or present a balanced view. Even when I know I might be in the wrong, I can still ask for advice without fearing judgment. There's comfort in that kind of emotional anonymity while still receiving validation from the listening party.

But why is this trend of resorting to AI particularly visible among Gen Zs? Mr Haikal believes it is partly due to growing up in the age of immediacy, which has shaped how younger people deal with distress. However, seeking quick relief may steer them towards unhelpful coping strategies rather than more sustainable, though slower, methods.

While AI tools offer immediate validation, they could fall short in the long run, he warns. '(Users) continue struggling with their emotions if these emotions are intense. They continue to feel empty or alone after communicating with the AI tools. The strategies offered may also be insufficient to alleviate their emotions,' he says.

Still, he sees signs of progress. That we are expressing ourselves at all, albeit to a chatbot, indicates this generation is not only more aware of mental health, but also more willing to be open and vulnerable. 'This is different from the previous generations, where individuals are more likely to detach and push their feelings aside.'

So, maybe it is not the most harrowing thought that, in the wee hours of the night, we turn to a chatbot to vent. Perhaps the simple desire to be heard is what makes us human.

Still, I'll admit nothing quite compares to a real debrief session with my friends. The kind that ends in knowing nods, laughter and hugs, and where I can show them what ChatGPT said, and we sit and evaluate it together.

So, robots aren't taking over any time soon. But thank you, ChatGPT, for replying with 'I hear you, that really sucks' whenever times are tough.

Hear Me Out is a new series where young journalists (over)share on topics ranging from navigating friendships to self-loathing, and the occasional intrusive thought.
