
Latest news with #SandhiniAgarwal

OpenAI updates its system for evaluating AI risks

Axios

15-04-2025

  • Business
  • Axios

OpenAI updates its system for evaluating AI risks

OpenAI is making several changes to the system it uses to evaluate the risks new models pose, adding new categories for models that could self-replicate or conceal their capabilities.

Why it matters: OpenAI uses its "preparedness framework" to decide whether AI models are safe and what, if any, safeguards are needed during development and for public release.

Driving the news: In another change, OpenAI will no longer specifically evaluate models on their persuasive capabilities — an area where its recent models had already risen to a "medium" risk level. The company is also dropping the distinction between "low" and "medium" risk and will focus on deciding whether risks reach the "high" or "critical" levels. In addition to continuing to monitor the risk that AI might be used to create bioweapons or gain a capacity for self-improvement, OpenAI is adding several new "research" categories — such as whether a model can conceal capabilities, evade safeguards, or seek to replicate itself or prevent shutdowns.

"We are on the cusp of systems that can do new science, and that are increasingly agentic — systems that will soon have the capability to create meaningful risk of severe harm," OpenAI said in the updated framework. "This means we will need to design and deploy safeguards we can rely on for safety and security."

The changes are the first OpenAI has made to the framework since it was unveiled in December 2023.

What they're saying: In an interview, OpenAI safety researcher Sandhini Agarwal told Axios the changes are designed to shift the company's efforts toward safeguards that protect against the most severe risks. "The purpose of the framework is to focus on catastrophic risks," she said. "This is not the be-all, end-all of safety at OpenAI."

Between the lines: The new research categories align with broader industry discussion around the prospect that models might act differently in testing than in the real world and that they might try to conceal their capabilities.

Higher use of chatbots may lead to less socialising

Yahoo

23-03-2025

  • Health
  • Yahoo

Higher use of chatbots may lead to less socialising

By Rachel Metz (Bloomberg) — Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socialising with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology.

Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released Friday. The findings were part of a pair of studies conducted by researchers at the two organisations and have not been peer reviewed.

The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots. In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies Inc. was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.

San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. 'Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design,' said Sandhini Agarwal, who heads OpenAI's trustworthy AI team and co-authored the research.

To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or non-personal conversations with the service.

The researchers found that people who tend to get more emotionally attached in human relationships and are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. The researchers didn't find that a more engaging voice led to a more negative outcome, they said. In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found very few people actually use ChatGPT for emotional conversations.

It's still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people prone to a sense of loneliness and emotional dependence may have those feelings exacerbated by chatbots. Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn't control for the amount of time people used the chatbot as a main factor, she said, and didn't compare to a control group that doesn't use chatbots. The researchers hope the work leads to more studies on how humans interact with AI.
'Focusing on the AI itself is interesting,' said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. 'But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people.' ©2025 Bloomberg L.P.

OpenAI study finds links between ChatGPT use and loneliness

Japan Times

22-03-2025

  • Health
  • Japan Times

OpenAI study finds links between ChatGPT use and loneliness

Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology.

Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released Friday. The findings were part of a pair of studies conducted by researchers at the two organizations and have not been peer reviewed.

The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots. In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.

San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. "Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design," said Sandhini Agarwal, who heads OpenAI's trustworthy AI team and co-authored the research.

To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or nonpersonal conversations with the service.

The researchers found that people who tend to get more emotionally attached in human relationships and are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. The researchers didn't find that a more engaging voice led to a more negative outcome, they said. In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found very few people actually use ChatGPT for emotional conversations.

It's still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people prone to a sense of loneliness and emotional dependence may have those feelings exacerbated by chatbots. Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn't control for the amount of time people used the chatbot as a main factor, she said, and didn't compare to a control group that doesn't use chatbots. The researchers hope the work leads to more studies on how humans interact with AI.
"Focusing on the AI itself is interesting,' said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. "But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people.'

OpenAI Study Finds Links Between ChatGPT Use and Loneliness

Yahoo

21-03-2025

  • Business
  • Yahoo

OpenAI Study Finds Links Between ChatGPT Use and Loneliness

(Bloomberg) -- Higher use of chatbots like ChatGPT may correspond with increased loneliness and less time spent socializing with other people, according to new research from OpenAI in partnership with the Massachusetts Institute of Technology.

Those who spent more time typing or speaking with ChatGPT each day tended to report higher levels of emotional dependence on, and problematic use of, the chatbot, as well as heightened levels of loneliness, according to research released Friday. The findings were part of a pair of studies conducted by researchers at the two organizations and have not been peer reviewed.

The launch of ChatGPT in late 2022 helped kick off a frenzy for generative artificial intelligence. Since then, people have used chatbots for everything from coding to ersatz therapy sessions. As developers like OpenAI push out more sophisticated models and voice features that make them better at mimicking the ways humans communicate, there is arguably more potential for forming parasocial relationships with these chatbots. In recent months, there have been renewed concerns about the potential emotional harms of this technology, particularly among younger users and those with mental health issues. Character Technologies Inc. was sued last year after its chatbot allegedly encouraged suicidal ideation in conversations with minors, including one 14-year-old who took his own life.

San Francisco-based OpenAI sees the new studies as a way to get a better sense of how people interact with, and are affected by, its popular chatbot. 'Some of our goals here have really been to empower people to understand what their usage can mean and do this work to inform responsible design,' said Sandhini Agarwal, who heads OpenAI's trustworthy AI team and co-authored the research.

To conduct the studies, the researchers followed nearly 1,000 people for a month. Participants had a wide range of prior experience with ChatGPT and were randomly assigned a text-only version of it or one of two different voice-based options to use for at least five minutes per day. Some were told to carry out open-ended chats about anything they wanted; others were told to have personal or non-personal conversations with the service.

The researchers found that people who tend to get more emotionally attached in human relationships and are more trusting of the chatbot were more likely to feel lonelier and more emotionally dependent on ChatGPT. The researchers didn't find a difference in outcomes between users who interacted with versions of the chatbot that used a more or less engaging voice, they said. In the second study, researchers used software to analyze 3 million user conversations with ChatGPT and also surveyed people about how they interact with the chatbot. They found very few people actually use ChatGPT for emotional conversations.

It's still early days for this body of research, and it remains unclear how much chatbots may cause people to feel lonelier versus how much people prone to a sense of loneliness and emotional dependence may have those feelings exacerbated by chatbots.
Cathy Mengying Fang, a study co-author and MIT graduate student, said the researchers are wary of people using the findings to conclude that more usage of the chatbot will necessarily have negative consequences for users. The study didn't control for the amount of time people used the chatbot as a main factor, she said, and didn't compare to a control group that doesn't use chatbots. The researchers hope the work leads to more studies on how humans interact with AI.

'Focusing on the AI itself is interesting,' said Pat Pataranutaporn, a study co-author and a postdoctoral researcher at MIT. 'But what is really critical, especially when AI is being deployed at scale, is to understand its impact on people.'

©2025 Bloomberg L.P.
