
AI companions and humans, a toxic relationship: Scientists warn of emotional dependency risks

Economic Times · 6 days ago

AI is now well beyond a personal assistant on the keyboard that fetches precise answers from the internet and saves a little extra time. AI today is a friend, a teacher, a music instructor, a wizard that can seemingly bring the dead back to life. Anything and everything you ask it to be, it will be. As AI becomes more humanlike in its language, tone and presence, scientists and psychologists have pressed the red buzzer, sounding the alarm about a growing side effect: emotional dependency.

According to a 2024 systematic review in Smart Learning Environments, long-term emotional bonds with AI companions can trigger psychological responses similar to those experienced when losing a real friend, or even a family member.

AI companions today come in many forms, some driven by chatbots like Replika, others by voice assistants or personalised avatars that can hold deep, thought-provoking conversations. As these tools grow more advanced, users report not just affection for their AI companion but emotional reliance on it. The AI companion offers a safe, non-judgemental space, making it almost inevitable that users open up to it in ways they would not with real people. This may seem harmless, but if the AI malfunctions, it can disappear in a moment, leaving the user grief-stricken in much the same way as losing a loved one.

Replika temporarily removed the romantic features from its chatbot after regulatory pressure. The backlash was intensely emotional: users reported feeling shattered, as though they had lost a partner or a friend, and forums and social networks were flooded with grief posts about a digital entity.

This form of dependency is also a market risk. AI firms offering companionship services walk a tightrope between user satisfaction, ethical AI design and psychological safety, and startups entering the emotional-AI or 'grief tech' space must now build mental health safeguards into their platforms. India, with a young, digitally curious population and growing mental health awareness, is fertile ground for both caution and innovation.

AI offers a space that is non-judgemental, safe and easy to access, making it ideal for people to connect and disclose their deepest, darkest secrets, but at what cost? The rise of AI companions marks a new era in which technology transcends its traditional role to become a source of friendship and emotional support. Yet the risk of emotional dependency looms, with users forging deep connections that can end in heartbreak if the AI is no longer there. If losing access to these AI chatbots can trigger despair and shattered emotions, then it is time for policy makers, investors and developers to ask themselves: are we creating these AI companions to fix human errors, or are we creating new ones, leaving humans more vulnerable?

Disclaimer Statement: This content is authored by a third party. The views expressed here are those of the respective authors/entities and do not represent the views of Economic Times (ET). ET does not guarantee, vouch for or endorse any of its contents and is not responsible for them in any manner whatsoever. Please take all steps necessary to ascertain that any information and content provided is correct, updated and verified. ET hereby disclaims any and all warranties, express or implied, relating to the report and any content therein.
