AI companion apps such as Replika need more effective safety controls, experts say

ABC News

The idea of having an emotional bond with a digital character was once a foreign concept. Now, "companions" powered by artificial intelligence (AI) are increasingly acting as friends, romantic partners or confidantes for millions of people.

With woolly definitions of "companionship" and "use" (some people use ChatGPT as a partner, for instance), it's difficult to tell exactly how widespread the phenomenon is. But the AI companion apps Replika and Chai each have 10 million downloads on the Google Play store alone, while in 2018 Microsoft boasted that its China-based chatbot XiaoIce had 660 million users.

These apps allow users to build characters, complete with names and avatars, which they can text or even hold voice and video calls with. But do these apps fight loneliness, or are they supercharging isolation? And is there any way to tip the balance in the right direction?

Romance and sexuality are big drawcards of the AI companion market, but people have a range of other reasons for setting up a chatbot. They may be seeking non-judgemental listening, tutoring (particularly in language skills), advice or therapy.

Bethanie Drake-Maples, a researcher at Stanford University who studies AI companions, says some people also use the apps to reflect their own persona. "Some people will create a digital twin and just have a relationship with an externalised version of themselves," she tells ABC Radio National's series Brain Rot.

Ms Drake-Maples published a study based on interviews with more than 1,000 students who used the AI companion app Replika. She and her colleagues found there were important benefits for some users. Most significantly, 30 of the interviewees said using the app had prevented them from attempting suicide. Many participants also reported that the app helped them forge connections with other people, whether through advice on their relationships, by helping them overcome inhibitions about connecting with others, or by teaching them empathy.

But other users reported no benefits, or negative experiences. Outside Ms Drake-Maples' study, AI companions have also been implicated in deaths.

Ms Drake-Maples points out that her study involved a self-selecting cohort, and is not necessarily representative of all Replika users. Her team is carrying out a longer-term study to see if they can glean more insights. But she believes it's possible these apps are, on the whole, beneficial for users. "We specifically wanted to understand whether or not Replika was displacing human relationship or whether it was stimulating human relationship," she says.

But this social promotion can't be taken for granted. Ms Drake-Maples is concerned that companion apps could replace people's interactions with other humans, making loneliness worse. The participants in her study were much lonelier than the general population, although this isn't necessarily unusual for young college students.

She believes governments should regulate AI companion technology to prevent this isolation. "There's absolutely money to be made by isolating people," she says. "There absolutely does need to be some kind of ethical or policy guidelines around these agents being programmed to promote social use, and not being programmed to try to isolate people."
Replika says it has introduced a number of controls to make its apps safer, including a "Get Help" button that directs people to professional helplines or scripts based on cognitive behavioural therapy, and a message coding system that flags "unsafe" messages and responds in kind. Ms Drake-Maples thinks this is a good example for other apps to follow. "These things need to be mandated across the board," she says.

Raffaele Ciriello, a researcher at the University of Sydney, is more sceptical of Replika's safety controls, calling them "superficial, cosmetic fixes". He points out the controls were introduced only months after the Italian government ruled, in early 2023, that the app had to stop using the data of Italian citizens, citing concerns about age verification. "They were fearing a regulatory backlash."

Dr Ciriello has also been interviewing and surveying AI companion users, and while he says some users have found benefits, he argues the apps are largely designed to foster emotional dependence. "If you look at the way [Replika is] making money, they have all the incentives to get users hooked and dependent on their products," he says.

Replika operates on a "freemium" model: a free base app, with more features (including the romantic partner option) available by paid subscription. Other companion apps follow the same model. "Replika and their kin have Silicon Valley values embedded in them. And we know what these look like: data, data, data, profit, profit, profit," Dr Ciriello says.

Nevertheless, he also believes it's possible to build AI companion technology more safely and ethically. Companies that consult vulnerable stakeholders, embed crisis response protocols, and advertise their products responsibly are more likely to create safer AI companions.

Dr Ciriello says Replika fails on several of these fronts. For instance, he calls its advertising "deceptive". The company badges its product as "the AI companion who cares". "[But] it's not conscious, it's not actually empathetic, it's not actually caring," Dr Ciriello says.

A Replika spokesperson said the tagline "the AI companion who cares" was "not a claim of sentience or consciousness". "The phrase reflects the emotionally supportive experience many users report, and speaks to our commitment to thoughtful, respectful design," they said. "In this regard, we are also working with institutions like the Harvard Human Flourishing Program and Stanford University to better understand how Replika impacts wellbeing and to help shape responsible AI development."

Dr Ciriello says the women-centred Australian app Jaimee is an example of an AI companion with better ethical design, although it faces the "same commercial pressures" as bigger apps in the market.

The California Senate last week passed a bill regulating AI chatbots. If the bill continues through the legislature and becomes law, it will, among other things, require companion chatbots to regularly remind users that they're not human, and enforce transparency around suicide and crisis data.

The bill is promising, Dr Ciriello says. "If the history of social media taught us anything, I would rather have a national strategy in Australia where we have some degree of control over how these technologies are designed and what their incentives are and how their algorithms work."

But, he adds, research on these apps is still in its infancy, and it will take years to understand their full impact. "It's going to take some time for that research to come out and then to inform sensible legislation."
Listen to the full episode about the rise and risks of AI companions, and subscribe to the podcast for more.
