‘Won't get annoyed, won't snap': Indonesians tap AI for judgement-free emotional support, but risks abound
CNA · 5 hours ago

JAKARTA: Ahead of an extended family gathering, Nirmala (not her real name) found herself unusually anxious.
The reason: Small talk that could spiral into interrogation.
'Sometimes I just don't know how to answer questions from relatives, and that stresses me out,' said Nirmala, 39, who asked to remain anonymous.
In contrast, the generative artificial intelligence platform ChatGPT has been nothing but a source of comfort ever since Nirmala began using it as a sounding board last October.
'It's not that I don't have anyone to talk to,' Nirmala told CNA Indonesia. 'But when I bring up things that people think are trivial, I'm often told I'm being dramatic. So I talk to AI instead – at least it listens without throwing judgement.'
Like Nirmala, overseas student Ila (not her real name) has turned to AI-driven chatbots for advice.
Ila, 35, first turned to ChatGPT in April 2023 when she was preparing to move abroad for further studies. She later also began using the Chinese AI platform DeepSeek.
At first, Ila – who also requested anonymity – used the platforms for practical information about university life and daily routines in her host country, which she declined to reveal.
'Before leaving for school, I had a ton of questions about life abroad, especially since I had to bring my children with me. AI became one of the ways I could gain perspective, aside from talking directly with people who'd already been through it,' she said.
The platforms' replies put her at such ease that in October last year, she began sharing her personal issues with the chatbots.
NO JUDGEMENT FROM CHATBOTS
AI chatbots have taken the world by storm in recent years, and more people are turning to them for help with mental health issues.
Indonesia is no different. An online survey in April by branding and data firm Snapcart found that 6 per cent of 3,611 respondents there are using AI "as a friend to talk to and share feelings with". Nearly six in 10 (58 per cent) of respondents who gave this answer said they would sometimes consider AI as a replacement for psychologists.
People in Southeast Asia's largest economy are not necessarily turning to AI chatbots because they lack human friends, but because AI is available 24/7 and "listens" without judgement, users and observers told CNA Indonesia.
The tool, they said, is especially handy in a country with a relatively low number of psychologists.
According to the Indonesian Clinical Psychologists Association, the country has 4,004 certified clinical psychologists, of whom 3,084 are actively practising.
With a population of about 280 million people, this translates to about 1.43 certified clinical psychologists per 100,000 population.
In comparison, neighbouring Singapore has 9.7 psychologists per 100,000 population – a ratio that is already lower than in other Organisation for Economic Cooperation and Development nations.
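The per-capita figure above follows directly from the two numbers cited; a quick sketch using the article's own figures confirms the arithmetic:

```python
# Quick check of the ratio cited above, using the article's figures.
certified_psychologists = 4_004   # per the Indonesian Clinical Psychologists Association
population = 280_000_000          # approximate population of Indonesia

per_100k = certified_psychologists / population * 100_000
print(f"{per_100k:.2f} certified clinical psychologists per 100,000 people")
# prints "1.43 certified clinical psychologists per 100,000 people"
```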
The potential benefits of using AI in mental health are clear, experts said, even as risks and the need for regulation exist.
The rise of AI as a trusted outlet for emotional expression is closely tied to people's increasingly digital lives, said clinical psychologist Catarina Asthi Dwi Jayanti from Santosha Mental Health Centre in Bandung.
AI conversations can feel more intuitive for those who grew up with texting and screens, she said, adding that at least a dozen clients have told her they have consulted AI.
"For some people, writing is a way to organise their thoughts. AI provides that space, without the fear of being judged," she said.
Conversing with ChatGPT is a safe way of rehearsing her thoughts before opening up to somebody close to her, Nirmala said. "Honestly it doesn't feel like I'm talking to a machine. It feels like a conversation with someone who gets me," she said.
AI chatbots offer accessibility, anonymity, and speed, said telecommunications expert Heru Sutadi, executive director of the Indonesia ICT Institute.
AI platforms, he said, are "programmed to be neutral and non-critical".
"That's why users often feel more accepted, even if the responses aren't always deeply insightful," he said.
Unlike a session with a psychologist, "you can access AI 24/7, often at little to no cost", Heru said. "Users can share as much as they want without the pressure of social expectations. And best of all, AI replies instantly."
In Indonesia, an in-person session with a private psychologist can cost upwards of 350,000 rupiah (US$21.50).
Popular telemedicine platform Halodoc offers psychiatrist consultations at prices starting from 70,000 rupiah, while mental health app Riliv offers online sessions with a psychologist at prices starting from 50,000 rupiah.
Another advantage of a chatbot, said Ila, is that it "won't get annoyed, won't snap, won't have feelings about me bombarding it with a dozen questions".
"That's not the case when you're talking to a real person," she added.
As such, AI can serve as a "first safe zone" before someone seeks professional help, especially when dealing with topics such as sexuality, religion, trauma or family conflict, said Catarina.
"The anonymity of the internet, and the comfort that comes with it, allows young people to open up without the fear of shame or social stigma," she explained.
Some of her clients, she added, turned to AI because they "felt free to share without worrying what others, including psychologists, might think of them, especially if they feared being labelled as strange or overly emotional."
RISKS AND IMPACT ON REAL-LIFE RELATIONSHIPS
But mental health professionals are just as wary of the risks posed by AI chatbots, citing issues such as privacy, regulation of the technology and their impact on users' real-life interactions with others.
The machines can offer a false sense of comfort, Heru said. "The perceived empathy and safety can be misleading. Users might think AI is capable of human warmth when, in reality, it's just an algorithm mimicking patterns."
Another major concern is data privacy, Heru said. Conversations with AI are stored on company servers and if cyber breaches occur, "sensitive data could be leaked, misused for targeted advertising, profiling, or even sold to third parties".
For its part, OpenAI, the company behind ChatGPT, has said: "We do not actively collect personal information to train our models, do not use public internet data to profile individuals, target advertising, or sell user data."
Indonesia released a National Strategy for Artificial Intelligence in 2020, but the document is non-binding. AI is currently governed loosely under the 2008 Electronic Information and Transactions (ITE) Law and the 2022 Personal Data Protection Law, both of which touch on AI but lack specificity.
A Code of Ethics for AI was issued by the Ministry of Communication and Digital Affairs in 2023, but its guidelines remain vague.
In January this year, Communication and Digital Affairs Minister Meutya Hafid announced that comprehensive AI regulations would be rolled out.
Studies are also emerging on the impact of chatbot usage on users' real-life social interactions.
In a 2024 study involving 496 users of the chatbot Replika, researchers from China found that greater use of AI chatbots, and satisfaction with them, could negatively affect a person's real-life interpersonal skills and relationships.
Child and adolescent clinical psychologist Lydia Agnes Gultom from Klinik Utama dr. Indrajana said AI-based relationships are inherently one-sided. Such interactions could hinder people's abilities to empathise, resolve conflicts, assert themselves, negotiate or collaborate, she said.
"In the long run, this reduces exposure to genuine social interaction," said Agnes.
In other countries, experts have highlighted the need for guardrails on the use of AI chatbots for mental health.
As these platforms tend to align with and reinforce users' views, they may fail to challenge dangerous beliefs and could potentially drive vulnerable individuals to self-harm, the American Psychological Association told US regulators earlier this year.
Safety features introduced by some companies, such as disclaimers that the chatbots are not "real people", are also inadequate, the experts said.
AI can complement the work of mental health professionals, experts told CNA Indonesia.
It can offer initial emotional support and a space for humans to share and explore their feelings with the right prompts, said Catarina of Santosha Mental Health Centre.
But when it comes to diagnosis and grasping the complexity of human emotions, AI still falls short, she said. "It lacks interview (skills), observation and a battery of assessment tools."
AI cannot provide proper intervention in emergency situations such as suicide ideation, panic attacks or abuse, said Agnes of Klinik Utama dr. Indrajana, a healthcare clinic in Jakarta.
Therapeutic relationships rooted in trust, empathy, and nonverbal communication can only happen between humans, she added.



