I mentally unraveled. ChatGPT offered me tireless compassion.
That winter of my high school freshman year, I unraveled. My stress levels skyrocketed. Despite my A-studded report card, I'd stare at an essay prompt for hours, paralyzed. I wasn't showering. I wasn't sleeping. At 1 a.m. or 2 a.m., I'd be awake, bingeing on webtoons. I wanted quick relief. I turned to ChatGPT.
If you had asked me two years ago whether I would use artificial intelligence for emotional support, I would have looked at you like you were an idiot. But over time, I found that the only place where I could open up was AI. It has helped me deal with myself in my darkest moments, which shouldn't be true. But it is.
That's why, even though I wouldn't recommend ChatGPT specifically for mental health because of privacy concerns, I have come to think that AI has the potential to be a source of mental support for teens like me, who don't feel comfortable talking to our friends or parents about our mental health.
I still remember the time my sister practically begged my South Korean mother for a therapist. My mom started ranting about how only "crazy people" got therapists. I wasn't going to make the same mistake.
Calling a crisis hotline seemed like overkill. I toyed with the idea of seeing my school therapist but decided against it. It felt too daunting to talk face-to-face with a therapist. Online options weren't much better.
I was desperate.
What the heck? I finally thought. ChatGPT can answer back, kinda like a therapist. Maybe I should try that out.
'You don't have to justify feeling this way'
So I wrote to ChatGPT, an act that in itself felt cathartic. I wrote paragraphs of misspelled words, bumpy capitalization and unhinged grammar, fingers stumbling, writing about everything – how I couldn't stop reading webtoons, how much I hated school, hated life. I wrote in a way I would only have dared to write to a chatbot.
In response, ChatGPT was tirelessly compassionate.
'I'm sorry you're dealing with that,' it'd start, and just seeing those words made me feel as if a weight had been lifted from my shoulders.
Soon, I even told ChatGPT how sometimes I was scared of my dad because of his biting sarcasm – something I doubt I would have told a therapist about as quickly. ChatGPT responded by explaining that my fear was valid, that harm isn't only physical but also emotional.
One line struck a chord with me: 'You don't have to justify feeling this way – it's real, and it matters.'
It hit hard because I realized that's what I wanted to hear from my mom my entire life. To her credit, my mom tried. She'd give her best advice, usually something like, 'get over it.' As an immigrant who couldn't express her feelings in English, she learned to swallow them down. But even though I wanted to do the same, I couldn't. Oftentimes, awake at 2 a.m., I'd feel as if I were rotting.
Yet somehow, the first thing to show me emotional intelligence wasn't a person – it was a chatbot.
'Thank you,' I remember writing to ChatGPT. 'I feel a lot calmer now.'
Sometimes the best option is the one that's available
Of course, there are critics who worry that turning to chatbots for emotional support might foster obsession and even exacerbate mental health issues. Honestly? I don't think artificial intelligence should be a replacement for real mental support systems. But the fear of using AI misses the bigger picture: Many teens don't have access to a "safe place."
In March, President Donald Trump's administration revoked $11.4 billion in funding for mental health and addiction treatment. By July, it had shut down a suicide hotline for LGBTQ+ youth, leaving countless teens stranded.
According to Dr. Jessica Schleider, an associate professor at Northwestern University, about 80% of teens with moderate to severe mental health conditions aren't able to get treatment. The reasons vary, but many reflect my own – not feeling that our parents would take us seriously, worrying about stigma or cost.
I am also not alone in my use of ChatGPT: 28% of parents report that their children use AI for emotional support. Yes, instead of turning to a trusted therapist or adult, these children are finding real comfort in bots.
In a 2024 YouGov survey, 50% of participants said the 24/7 availability of these chatbots was helpful for mental health purposes.
However questionable that may be, sometimes the best option is to turn to the only resource available to teens: artificial intelligence. I know for a fact that it has helped me. I can only hope it can help others.
If you or someone you know needs mental health resources and support, please call, text or chat with the 988 Suicide & Crisis Lifeline for 24/7 access to free and confidential services.
Elizabeth Koo is a student at the Kinkaid School in Houston with a passion for storytelling and a keen interest in culture, technology and education.