
Latest news with #AIAscent

AI isn't ready to be your therapist, but it's a top reason people use it

The Star

26-05-2025


From falling in love with ChatGPT to deepfakes of deceased loved ones, artificial intelligence's potential for influence is vast – its myriad potential applications not yet completely charted. In truth, today's AI users are pioneering a new, still swiftly developing technological landscape, something arguably akin to the birth of social media in the early 2000s.

Yet, in an age of uncertainty about nascent generative AI's full potential, people are already turning to artificial intelligence for major life advice. One of the most common ways people use generative AI in 2025, it turns out, is for therapy. But the technology isn't ready yet.

How people use AI in 2025

As of January 2025, ChatGPT topped the list of most popular AI tools with 4.7 billion monthly site visits, according to Visual Capitalist. That dwarfed the next most popular service, Canva, by more than five to one. When it comes to understanding AI use, digging into how ChatGPT is being put to work this year is a good starting point.

Sam Altman, CEO of ChatGPT's parent company, OpenAI, recently offered some insight into how its users are making the most of the tool by age group. 'Gross oversimplification, but like older people use ChatGPT as a Google replacement,' Altman said at Sequoia Capital's AI Ascent event a few weeks ago, as transcribed by Fortune. 'Maybe people in their 20s and 30s use it as like a life advisor, and then, like people in college use it as an operating system.'

It turns out that life advice is something a lot of AI users may be seeking these days. Writing in Harvard Business Review, author and company co-founder Marc Zao-Sanders recently completed a qualitative study on how people are using AI. 'Therapy/companionship' topped the list as the most common way people are using generative AI, followed by life organisation and then people seeking purpose in life. If OpenAI's chief is right, AI-generated life advice can be an incredibly powerful influence.

A Pew Research Center survey published last month reported that a 'vast majority' of surveyed AI experts said people in the United States interact with AI several times a day, if not almost constantly. Around a third of surveyed US adults said they had used a chatbot (which would include things like ChatGPT) before. Some tech innovators, including a team of Dartmouth researchers, are leaning into the trend.

Therabot, can you treat my anxiety?

Dartmouth researchers have completed a first-of-its-kind clinical trial of a generative AI-powered therapy chatbot. The smartphone app-friendly Therabot has been in development since 2019, and its recent trial showed promise. Just over 100 patients – each experiencing depressive disorder, generalized anxiety disorder or an eating disorder – participated in the experiment. According to senior study author Nicholas Jacobson, the improvement in each patient's symptoms was comparable to traditional outpatient therapy. 'There is no replacement for in-person care, but there are nowhere near enough providers to go around,' he told the college.

Even Dartmouth's Therabot researchers, however, said generative AI is simply not ready to be anyone's therapist. 'While these results are very promising, no generative AI agent is ready to operate fully autonomously in mental health, where there is a very wide range of high-risk scenarios it might encounter,' first study author Michael Heinz told Dartmouth. 'We still need to better understand and quantify the risks associated with generative AI used in mental health contexts.'

Why is AI not ready to be anyone's therapist?

Ben Bond of RCSI University of Medicine and Health Sciences is a PhD candidate in digital psychiatry who researches how digital tools can be used to benefit or better understand mental health. Writing in The Conversation, Bond broke down how AI therapy tools like Therabot could pose some significant risks.

Among them, Bond explained that AI 'hallucinations' are known flaws in today's chatbot services. From quoting studies that don't exist to giving outright incorrect information, he said these hallucinations could be dangerous for people seeking mental health treatment. 'Imagine a chatbot misinterpreting a prompt and validating someone's plan to self-harm, or offering advice that unintentionally reinforces harmful behaviour,' Bond wrote. 'While the studies on Therabot and ChatGPT included safeguards – such as clinical oversight and professional input during development – many commercial AI mental health tools do not offer the same protections.'

According to Michael Best, PhD, a psychologist and contributor to Psychology Today, there are other concerns to consider, too. 'Privacy is another pressing concern,' he wrote in Psychology Today. 'In a traditional setting, confidentiality is protected by professional codes and legal frameworks. But with AI, especially when it's cloud-based or connected to larger systems, data security becomes far more complex.

'The very vulnerability that makes therapy effective also makes users more susceptible to harm if their data is breached. Just imagine pouring your heart out to what feels like a safe space, only to later find that your words have become part of a data set used for purposes you never agreed to.'

Best added that bias is a significant concern, something that could lead to AI therapists giving bad advice. 'AI systems learn from the data they're trained on, which often reflect societal biases,' he wrote. 'If these systems are being used to deliver therapeutic interventions, there's a risk that they might unintentionally reinforce stereotypes or offer less accurate support to marginalized communities.

'It's a bit like a mirror that reflects the world not as it should be, but as it has been – skewed by history, inequality, and blind spots.'

Researchers are making progress in improving AI therapy services. Patients suffering from depression experienced an average 51% reduction in symptoms after participating in Dartmouth's Therabot experiment. For those suffering from anxiety, there was an average 31% drop in symptoms. Patients suffering from eating disorders showed the smallest reduction in symptoms but were still an average of 19% better off than before they used Therabot.

It's possible there's a future where artificial intelligence can be trusted to treat mental health, but – according to the experts – we're just not there yet. – The Atlanta Journal-Constitution/Tribune News Service

Google chief scientist says AI could rival junior coders by 2026

India Today

19-05-2025


AI might be getting close to doing the job of a junior software engineer, according to Jeff Dean, Google's chief scientist. During Sequoia Capital's AI Ascent event, Dean said it may only be 'about a year-ish' before artificial intelligence reaches the skill level of an entry-level coder. While that may sound futuristic, it's a view that aligns with what many tech leaders have been saying as AI continues to improve rapidly, especially in tasks like coding. At a time when jobs in the field of tech are already scarce and competition is fierce, especially for fresh graduates, this could mean even more pressure on entry-level roles.

Dean's comments come as generative AI tools like ChatGPT, GitHub Copilot, and Google's own Gemini become increasingly common among developers. These tools are already being used to automate repetitive coding tasks, offer real-time suggestions, and even generate full blocks of code. But Dean also noted that writing code in an IDE (integrated development environment) is only one part of what junior engineers do.

'This hypothetical virtual engineer probably needs a better sense of many more things than just writing code in an IDE,' Dean said in the Business Insider report. 'It needs to know how to run tests, debug performance issues, and all those kinds of things.'

When asked how AI might eventually learn these broader engineering skills, Dean compared it to how real people gain experience — by learning tools, studying documentation, and learning from more experienced colleagues. 'We know how human engineers do those things,' he explained. 'They learn how to use various tools that we have, and can make use of them to accomplish that. And they get that wisdom from more experienced engineers, typically, or reading lots of documentation.'

Dean believes AI could do something similar — trying out solutions in virtual environments, learning from documentation, and improving over time. 'I feel like a junior virtual engineer is going to be pretty good at reading documentation and sort of trying things out in virtual environments,' he said. 'That seems like a way to get better and better at some of these things.'

While he didn't say how far this could eventually go, Dean thinks it's going to make a meaningful difference. 'I don't know how far it will take us, but it seems like it'll take us pretty far,' he said. Google had not responded to Business Insider's request for comment at the time of publication.

Google chief scientist predicts AI could perform at the level of a junior coder in a year

Business Insider

19-05-2025


Jeff Dean, Google's chief scientist, thinks that AI will soon be able to replicate the skills of a junior software engineer. "Not that far," he said during Sequoia Capital's "AI Ascent" event, when asked how far AI was from being on par with an entry-level engineer. "I will claim that's probably possible in the next year-ish."

Plenty of tech leaders have made similar predictions as models have continued to improve at coding, and AI tools become increasingly popular among programmers. With sweeping layoffs across the tech industry, entry-level engineers are already fielding intense competition — only to see it compounded by artificial intelligence.

Still, Dean said, AI has more to learn beyond the basics of programming before it can produce work at the level of a human being. "This hypothetical virtual engineer probably needs a better sense of many more things than just writing code in an IDE," he said. "It needs to know how to run tests, debug performance issues, and all those kinds of things."

As for how he expects it to acquire that knowledge, Dean said that the process won't be entirely unlike that of a person trying to gain the same skills. "We know how human engineers do those things," he said. "They learn how to use various tools that we have, and can make use of them to accomplish that. And they get that wisdom from more experienced engineers, typically, or reading lots of documentation."

Research and experimentation are key, he added. "I feel like a junior virtual engineer is going to be pretty good at reading documentation and sort of trying things out in virtual environments," Dean said. "That seems like a way to get better and better at some of these things."

Dean also said the impact of "virtual" engineers will likely be significant. "I don't know how far it will take us, but it seems like it'll take us pretty far," he said.

College students are using ChatGPT as their personal life coach — 5 prompts to try it (even if you're not Gen Z)

Tom's Guide

14-05-2025


College students are finding new ways to use ChatGPT that go beyond writing essays and brainstorming ideas. According to OpenAI CEO Sam Altman, students are relying on chatbots for so much more than homework. In a recent video taken at Sequoia Capital's AI Ascent event, Altman explained that Gen Z is using ChatGPT to make real decisions about their goals, relationships, routines and their future as a whole.

Unlike Google, ChatGPT gives advice in full sentences. It can remember context, offer suggestions without judgment and talk through your uncertainty like a calm, 24/7 sounding board. I've used ChatGPT to tackle everything from project summaries to panic attacks, and have found it to be a great sounding board when facing tough choices. While it's no substitute for human guidance or a therapist, ChatGPT can be a great assistant in a pinch. If you're interested in trying it for yourself, here are the five best prompts to turn ChatGPT into your own personal life coach.

Prompt: 'Help me make a decision between two things. Ask me questions first.'

Why it works: ChatGPT shines when you give it room to think with you. In other words, speak to it as you would a friend, not like a Google search. The chatbot will ask clarifying questions, help you weigh pros and cons and even flag emotional bias. AI can often help when human emotion clouds judgement. As a bot, it doesn't have feelings and won't judge you, even when the decision is life-changing.

Prompt: 'I'm overwhelmed and don't know where to start. Can you help me prioritize my day?'

Why it works: When you don't know what to do and just want to throw your hands up, ChatGPT can offer some structure. You'll get a response that's often more realistic than your own to-do list. It's like an assistant for when everyday choices are bogging you down.

Prompt: 'I'm thinking about switching majors / jobs / cities. Talk me through the decision.'

Why it works: The longer you use ChatGPT (signed in under one account), the better help it can be. The chatbot can remember conversations you've had and recall difficulties that may influence new decisions. It can also surface long-term vs. short-term consequences that you may not have considered, which can help you get clear on what matters most, without pushing you in one direction.

Prompt: 'I need to have a hard conversation with someone. Can you help me script it?'

Why it works: From awkward roommate dynamics to asking your professor for an extension without sounding like you're making up an excuse, ChatGPT can help you phrase things in a way that's firm but still friendly. Honestly, I wish I'd had this in college — maybe it could've helped me gently explain to my roommate that midnight is not the ideal time for full-volume karaoke.

Prompt: 'Give me 3 questions to reflect on when I feel burned out.'

Why it works: These types of prompts help the model act as a vibe check for how a college student is feeling emotionally and physically. Juggling school, work, relationships and more can cloud anyone's mental clarity. This is a quick check-in that can spot possible red flags.

Is ChatGPT a licensed therapist or professional coach? No. But is it surprisingly good at giving calm, helpful and sometimes eye-opening advice? Absolutely. If you've ever felt stuck in your own head, overwhelmed by choices, or just need someone to 'talk it through' with, ChatGPT might be the easiest life coach you'll never have to book.
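The article frames these prompts for the ChatGPT app, but the same pattern can be scripted against OpenAI's API. Below is a minimal sketch, assuming the openai Python package (v1+) and an OPENAI_API_KEY environment variable; the 'life coach' system prompt, the model choice and the variable names are illustrative assumptions, not details from the article.

```python
# Minimal sketch: one "life coach" exchange over OpenAI's chat API.
# Assumes `pip install openai` (v1+) and OPENAI_API_KEY in the environment.
# The system prompt and model name are illustrative choices.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

history = [
    {"role": "system",
     "content": ("You are a calm, non-judgmental life coach. "
                 "Ask clarifying questions before offering advice.")},
    {"role": "user",
     "content": ("Help me make a decision between two things. "
                 "Ask me questions first.")},
]

response = client.chat.completions.create(
    model="gpt-4o",  # any chat-capable model would do here
    messages=history,
)
reply = response.choices[0].message.content
print(reply)

# Appending each turn preserves the running context that the article
# credits ChatGPT's memory with across a longer conversation.
history.append({"role": "assistant", "content": reply})
```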

OpenAI CEO Sam Altman says Gen Z and millennials are using ChatGPT like a ‘life adviser'—but college students might be one step ahead

Yahoo

14-05-2025


OpenAI CEO Sam Altman said different generations use ChatGPT in different ways. Younger people tend to use it more as an adviser, while older generations use it as a replacement for a search tool, like Google. Experts are divided on whether it's safe to use LLMs for advice.

As ChatGPT becomes more sophisticated, its practical use cases grow. And as it turns out, different generations use the product differently, according to Altman. 'Gross oversimplification, but like older people use ChatGPT as a Google replacement. Maybe people in their 20s and 30s use it as like a life adviser, and then, like people in college use it as an operating system,' Altman said at Sequoia Capital's AI Ascent event earlier this month.

Venture-capital firm Sequoia first invested in OpenAI in 2021, when the company was valued at $14 billion. Currently, OpenAI is valued at $300 billion after one of the largest-ever private funding rounds. Sequoia has also invested in other tech giants like Nvidia, Reddit, Instacart, YouTube, Apple, Dropbox, Airbnb, and DoorDash.

Altman said young people use ChatGPT similarly to how they'd use an operating system: they have complex ways to set it up, connect it to files, and keep fairly complex prompts memorized or saved somewhere. 'I mean, that stuff, I think, is all cool and impressive,' Altman said. 'And there's this other thing where, like, they don't really make life decisions without asking ChatGPT what they should do.'

Earlier this year, OpenAI published a report saying 'more than any other use case, more than any other kind of user, college-aged young adults in the U.S. are embracing ChatGPT,' adding that more than one-third of 18-to-24-year-olds use ChatGPT. Younger users are able to do this because ChatGPT retains memory of previous conversations the user has had with the AI product. 'It has the full context on every person in their life and what they've talked about,' Altman said.

Reports show people have started using ChatGPT for everything from relationship advice to business and medical questions. Others use it as a replacement for talk therapy. Meanwhile, experts in those respective fields are torn on whether it's safe and advisable to consult ChatGPT for major life decisions. For example, a November 2023 study 'highlights the need for caution when using ChatGPT for safety-related information and expert verification, as well as the need for ethical considerations and safeguards to ensure users understand the limitations and receive appropriate advice.' Another study said large language models, like ChatGPT, are 'inherently sociopathic,' making it difficult to trust their advice. Other studies and experiments, however, show using ChatGPT for common advice to be harmless — and even helpful in some cases. OpenAI didn't immediately respond to Fortune's request for comment about whether it's safe or reliable to use ChatGPT for advice.

'The difference is unbelievable' in how a 20-year-old might use ChatGPT versus older generations, Altman said during the Sequoia talk. 'It reminds me of, like, when the smartphone came out, and, like, every kid was able to use it super well,' Altman said. 'And older people, just like, took, like, three years to figure out how to do basic stuff.'
