Latest news with #Youper


Forbes
29-04-2025
- Health
- Forbes
AI Therapists Are Here: 14 Groundbreaking Mental Health Tools You Need To Know
There are many fields where generative AI is proving to have truly transformative potential, and some of the most interesting use cases are around mental health and wellbeing. While it can't provide the human connection and intuition of a trained therapist, research has shown that many people are comfortable sharing their worries and concerns with relatively faceless and anonymous AI bots. Whether this is always a good idea, given the black-box nature of many AI platforms, is up for debate. But it's becoming clear that in specific use cases, AI has a role to play in guiding, advising and understanding us. So here I will look at some of the most interesting and innovative generative AI tools that are reshaping the way we think about mental health and wellbeing today.

Headspace
Headspace is a hugely popular app that provides calming mindfulness and guided meditation sessions. It has recently expanded into a full digital mental healthcare platform, offering access to therapists and psychiatric services as well as generative AI tools. Its first AI tool is Ebb, designed to take users on reflective meditation experiences. Headspace focused heavily on the ethical implications of introducing AI into mental healthcare scenarios when creating the tool. This is all part of its mission to make digital mindfulness and wellness accessible to as many people as possible through dynamic content and interactive experiences.

Wysa
This is another very popular tool, widely used by corporate customers to provide digital mental health services to employees, though anyone can use it. Its AI chatbot provides anonymous support and is trained in cognitive behavioral therapy, dialectical behavioral therapy and mindfulness. Wysa's AI was built from the ground up by psychologists and is tailored to work as part of a structured package of support, which includes interventions from human wellbeing professionals. Another standout is the selection of features tailored to helping young people. Wysa is one of the few mental health and wellbeing AI platforms that has been clinically validated in peer-reviewed studies.

Youper
This platform is billed as an emotional health assistant and uses generative AI to deliver conversational, personalized support. It blends natural-language chatbot functionality with clinically validated methods, including CBT. According to its website, its effectiveness at treating six mental health conditions, including anxiety and depression, has been confirmed by Stanford University researchers, and users can expect benefits in as little as two weeks.

Mindsera
This is an AI-powered journaling app designed to help users manage their mental health by providing insights and emotional analytics based on their writing. It offers a number of journaling frameworks as well as guidance from AI personas in the guise of historical figures. It aims to help users get to the bottom of the emotional drivers behind their thought processes and explore them through the process of writing and structuring their thoughts. Chatbot functionality makes journaling a two-way process, with the AI guiding the user down different pathways for exploring their mental wellbeing, depending on how and what they write about. Mindsera can even create images and artwork based on users' journaling, giving new perspectives on their mental health and wellbeing.
Woebot
Woebot is a 'mental health ally' chatbot that helps users deal with symptoms of depression and anxiety. It aims to build a long-term, ongoing relationship through regular chats, listening and asking questions in the same way a human therapist would. Woebot mixes natural-language-generated questions and advice with crafted content and therapy exercises created by clinical psychologists. It is also trained to detect 'concerning' language from users and immediately provide information about external sources where emergency help or interventions may be available (a simplified sketch of this kind of escalation check appears below). Woebot appears to be available only to Apple device users.

The choice of tools and platforms dedicated to mental health and wellbeing is growing all the time. Here are some of the other top choices out there:

Calm
Alongside Headspace (see above), Calm is one of the leading meditation and sleep apps. It now uses generative AI to provide personalized recommendations. Although it is not a dedicated mental health app, therapists and psychologists are among the AI characters the platform offers, and both are available free of charge 24/7.

EmoBay
Your 'psychosocial bestie', offering emotional support with daily check-ins and journaling.

HeyWellness
This platform includes a number of wellness apps, including HeyZen, designed to help with mindfulness and calm.

Joy
Joy is an AI virtual companion that delivers help and support via WhatsApp chat.

Kintsugi
Takes the innovative approach of analyzing voice data and journals to provide stress and mental health support.

Life Planner
This is an all-in-one AI planning and scheduling tool that includes functions for tracking habits and behaviors in order to develop healthy and mindful routines.

Manifest
This app bills itself as 'Shazam for your feelings' and is designed with young people in mind.

Reflection
A guided journaling app that leverages AI for personalized guidance and insights.

Resonance
An AI-powered journaling tool developed at MIT, designed to work with users' memories to suggest future paths and activities.

Talking therapies like CBT have long been understood to be effective ways of looking after our mental health, and AI chatbots offer a combination of accessibility and anonymity. As AI becomes more capable and more deeply interwoven with our lives, I predict many more people will explore its potential in this field. Of course, it won't replace the need for trained human therapists any time soon. However, AI will become another tool in the toolbox that therapists can use to help patients take control of their mental wellbeing.
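Woebot's handling of 'concerning' language follows a widely used escalation pattern: screen each incoming message before replying, and surface crisis resources whenever it is flagged. The Python sketch below is purely illustrative and hypothetical; it is not Woebot's implementation, and a production system would use trained classifiers and clinically reviewed resource lists rather than a hand-written phrase list.

```python
# Hypothetical, keyword-based escalation check: illustration only.
CONCERNING_PHRASES = [
    "hurt myself",
    "end my life",
    "no reason to live",
]

CRISIS_MESSAGE = (
    "It sounds like you may be going through something serious. "
    "Please consider contacting a local crisis line or emergency services."
)


def is_concerning(message: str) -> bool:
    """Flag a message if it contains any phrase on the watch list."""
    lowered = message.lower()
    return any(phrase in lowered for phrase in CONCERNING_PHRASES)


def generate_chat_reply(message: str) -> str:
    """Placeholder for the chatbot's normal response path."""
    return "Thanks for sharing. Can you tell me more about how that feels?"


def respond(message: str) -> str:
    # The escalation check runs before any normal reply is generated,
    # so crisis information always takes priority.
    if is_concerning(message):
        return CRISIS_MESSAGE
    return generate_chat_reply(message)


if __name__ == "__main__":
    print(respond("Lately I feel like there's no reason to live."))
```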

News.com.au
26-04-2025
- Health
- News.com.au
People are turning to AI apps like ChatGPT for therapy
Cast your mind back to the first time you heard the phrase 'Google it'. Early to mid 2000s, maybe? Two decades later, 'Googling' is swiftly being replaced by 'Ask ChatGPT'. ChatGPT, OpenAI's groundbreaking AI language model, is now having anything and everything thrown at it, including being used as a pseudo-therapist. Relationship issues, anxiety, depression, mental health and general wellbeing: for better or worse, ChatGPT is being asked to do the heavy lifting on all of our troubles, big and small. This is a big ask of what was infamously labelled a 'bullshit machine' by ethics and IT researchers last year.

The role of AI in mental health support
A recent report from OpenAI showed how people were using the tool, including for health and wellbeing purposes. As artificial intelligence is accepted into our lives as a virtual assistant, it is not surprising that we are divulging our deepest thoughts and feelings to it, too. There are a variety of therapy apps built for this specific purpose. Meditation app Headspace has been promoting mindfulness for over a decade, but with the rise of AI over the last few years, AI-powered therapy tools now abound, with apps such as Woebot Health, Youper and Wysa gaining popularity. It's easy to dismiss these solutions as gimmicks at best and outright dangerous at worst. But in an already stretched mental healthcare system, there is potential for AI to fill the gap. According to the Australian Bureau of Statistics, over 20 per cent of the population experience mental health challenges every year, and that number is continuing to trend upwards. When help is sought, approaches that rely on more than face-to-face consultations are needed to meet demand.

Public perception of AI therapy apps
The prevalence and use of AI therapy apps suggest a shift in the public perception of using tech to support mental health. AI also creates a lower barrier to entry: it allows users to try these tools without having to overcome the added fear or perceived stigma of seeing a therapist. That, however, comes with its own challenges, notably the lack of oversight of the conversations taking place on these platforms. There is a concept in AI called human-in-the-loop. It embeds a real-life professional into AI-driven workflows, ensuring that somebody is validating the outputs. This is an established concept in AI, but one that is being skipped over more and more in favour of pure automation. Healthcare generally has human-in-the-loop feedback systems built into it; for example, a diagnosis is double-checked before action is taken. Strict reliance on AI apps alone typically skips this part of the process.
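To make that pattern concrete, here is a minimal Python sketch of what a human-in-the-loop review step might look like: the AI drafts a reply, but nothing reaches the client until a qualified reviewer approves or rewrites it. Every name and message here is hypothetical and invented for illustration; it is not drawn from any of the platforms discussed.

```python
# Minimal human-in-the-loop sketch: AI drafts are held in a queue and a
# professional must approve (or rewrite) each one before it is released.
from dataclasses import dataclass, field


@dataclass
class DraftReply:
    client_message: str
    ai_draft: str
    approved: bool = False
    final_text: str = ""


@dataclass
class ReviewQueue:
    pending: list[DraftReply] = field(default_factory=list)

    def submit(self, client_message: str, ai_draft: str) -> DraftReply:
        """Queue an AI draft for professional review instead of sending it."""
        draft = DraftReply(client_message, ai_draft)
        self.pending.append(draft)
        return draft

    def review(self, draft: DraftReply, reviewer_edit: str | None = None) -> str:
        """A human validates the output; only the approved text is released."""
        draft.final_text = reviewer_edit if reviewer_edit else draft.ai_draft
        draft.approved = True
        self.pending.remove(draft)
        return draft.final_text


queue = ReviewQueue()
draft = queue.submit(
    "I've been feeling anxious all week.",
    "That sounds hard. Have you tried the breathing exercise we discussed?",
)
# Nothing is sent until the clinician signs off; here they accept the draft.
print(queue.review(draft))
```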
The risks of replacing human therapists with technology
The fact is, we are asking important questions of something that has no genuine, lived experience. For a start, OpenAI states that ChatGPT has not been designed to replace human relationships. Yet language models are general-purpose tools, and users will inevitably find ways to put them to work in new and unexpected ways. There are few conversational limits in place, and it is available to users all day, every day. Combine that with its natural communication style, its neutral emotional register and its ability to simulate human interaction, and treating it as a confidant is a logical development. But it is important to remember: whatever wisdom it imparts is a pastiche of training data and internet sources. It cannot truly know if its advice is good or bad; it could be convincingly argued that it also does not care whether it is giving you the right advice.

Let's push this thought further: AI does not care about your wellbeing. Not really. It will not follow up if you don't show up on schedule, nor will it alert carers or the authorities if it believes something is wrong. We get into even pricklier territory when we recall that AI is trained to elicit positive responses from you, which increases user preference ratings and keeps you coming back for more. This is where living, breathing therapists are key. Their gut instincts are not running on any definable algorithm. They use their knowledge of a patient and their years of training and experience in the field to formulate care plans and appropriate responses if things are going off track.

'The risk is that people see new tech as a panacea,' says Macquarie University Psychological Sciences Professor Nick Titov. 'But we are working with very vulnerable people. We have a responsibility and duty of care.' Titov is Executive Director of MindSpot, a digital psychology platform funded by the Australian Government. The free service seeks to remove obstacles to accessing mental health support. Key to the platform is the ability for people to access real, qualified therapists. 'Whether it's our mental health or general health, use cases will always differ, and so there are nuances which must be considered. Tech alone is not an end-to-end solution.'

Real vs. simulated care
So, while AI support might not be 'real', does the distinction actually matter if the user feels cared for? As long as users feel AI solves or alleviates their immediate concerns, it will continue to be used. But the majority of people seeking AI-driven therapy will turn to largely unmonitored platforms, including tools like ChatGPT that were not purpose-built. One promising approach mixes the supervision of real-life professionals with AI. Australia-based clinical psychologist Sally-Anne McCormack developed ANTSA, an AI therapy platform that gives therapists visibility of the conversations their clients have with its AI-powered chatbot. 'I see AI as a support tool,' McCormack says. 'But with most apps, you don't know what the AI is saying, and you don't know what the clients are saying back. I couldn't believe nobody was overseeing it.' The app provides users with prompts and recommendations, but does so under the watchful eye of their treating practitioner. 'We make it clear to users that you are speaking to AI. It is not a real person and your practitioner may view the conversation,' she said. 'Even so, clients are telling our AI things they have never told their practitioners. In that moment, there's no judgement.'

Convenience, availability, lack of judgement: all of these are factors in people using AI for everyday tasks. Just as 'Google it' reshaped how we seek information, 'Ask ChatGPT' is reshaping how we build a spreadsheet, create stories, seek advice, and ultimately navigate this thing called life. But maybe mental health support demands something fundamentally more human. The ongoing challenge will be deciding precisely how AI and human expertise come together.


Forbes
22-04-2025
- Forbes
AI's Shocking Pivot: From Work Tool To Digital Therapist And Life Coach
It's been just over two years since the launch of ChatGPT kickstarted the generative AI revolution. In that short time, we've seen it evolve into a powerful and truly useful business tool. But the ways it's being used might come as a surprise. When we first saw it, many of us probably assumed it would mainly be used to carry out creative and technical tasks on our behalf, such as coding and writing content. However, a recent survey reported in Harvard Business Review suggests this isn't the case. Rather than doing our work for us, the majority of users are looking to it for support, organization and even friendship.

Topping the list of use cases, according to the report, is therapy and companionship. This suggests that its 24/7 availability and its ability to offer anonymous, honest advice and feedback are highly valued. On the other hand, marketing tasks such as blog writing, creating social media posts or advertising copy appear far lower down the list of popular uses. So why is this? Let's take a look at what the research shows and what it could imply about the way we as humans will continue to integrate AI into our lives and work.

One thing that's clear is that although generative AI is quite capable of doing work for us while we put our feet up and relax, many prefer to use it for generating ideas and brainstorming. This could simply come down to the quality of AI-generated material, or even an inbuilt human bias that deters us from wanting to consume robotic content. It's often noted that generative AI's writing style can come across as very bland and formulaic. When asked, most people still say they would rather read content created by humans, even if, in practice, we can't always tell the difference.

As the report's author, Marc Zao-Sanders, states: 'the top 10 genAI use cases in 2025 indicate a shift from technical to emotional applications, and in particular, growth in areas such as therapy, personal productivity and personal development.' After therapy and companionship, the most common uses for generative AI were 'organizing my life', 'finding purpose' and 'enhancing learning'. The first technical use case, 'creating code', ranked fifth on the list, followed by 'generating ideas'. This upends some seemingly common-sense assumptions about how society would adopt generative AI, suggesting it will be used in more reflective, introspective ways than was first predicted.

In particular, therapeutic uses topping the list may seem surprising. But when we consider that there is a worldwide shortage of professionals trained to talk us through mental health challenges, it makes more sense. The survey's findings are supported by the wide range of emerging genAI applications designed for therapeutic use, such as Wysa, Youper and Woebot. A growing need to continuously learn and upskill in the face of technological advancement could also explain the popularity of using AI to enhance our education and professional development.

Overall, these insights indicate that generative AI is being adopted across a broader range of facets of everyday life, rather than simply doing the work we don't want to do ourselves. The current trajectory of AI use suggests a future where AI is seen as a collaborative and supportive assistant, rather than a replacement for human qualities and abilities. This has important implications for the way it will be used in business.
Adopting it for use cases that support human workers, rather than attempting to replace them, is likely to lead to happier, less stressed and ultimately more productive employees. There is already growing evidence that businesses see investing in AI-based mental health companions and chatbots as a way of mitigating the loss of productivity caused by stress and anxiety. As generative AI continues to evolve, we can expect it to get better at these types of tasks. Personalized wellness support, guided learning and education opportunities, organizing workflows and brainstorming ideas are all areas where it can provide a huge amount of value to organizations while easing the anxiety that it is here to replace us or make us redundant. Understanding how AI is being used today is essential if we want to influence how it evolves in the future. While it's easy to imagine a world where robots take over all our tasks, the real opportunity lies in using AI to help us work more intelligently, collaborate more effectively, and support healthier, more balanced ways of working.