Like real-life Dr. Dolittles, scientists are using AI to decode animal communication

CBC · a day ago

Bottlenose dolphins are known for their intelligence, and now researchers are trying to find out whether we could one day communicate with them in their own language.
Researchers from Woods Hole Oceanographic Institution (WHOI) in Massachusetts and the Sarasota Dolphin Research Program in Florida are using artificial intelligence to decode the meaning behind dolphin whistles.
"Our objective is to understand their rules of communication, what the structure, function, and meaning of dolphin communication is," Frants Havmann Jensen, an investigator at WHOI's Marine Research Facility, told The Current's Matt Galloway.
"So, not just identifying the sounds they make but uncovering what those sounds mean to them."
In May, the researchers won the Coller Dolittle Challenge for Interspecies Two-Way Communication for that work. The prize honours researchers who've made significant scientific advances that could pave the way for human-animal communication.
Yossi Yovel, who led the judging panel for the Coller Dolittle prize, says the Jeremy Coller Foundation is interested in unlocking a deeper understanding of language across species.
"By understanding how communication has evolved across many different species, we can better understand the evolutionary roots of communication and language," he said.
Yovel says understanding the signals and the messages they convey is a crucial first step to decoding bottlenose dolphins' communication system.
From there, scientists can begin to understand how dolphins organize signals when they're communicating to create what humans would understand as sentences.
"The next step would be to present signals that you've discovered to the animal and observe their response, and to show that you can do this in multiple contexts," he said.
Using AI to enhance understanding
The Sarasota Dolphin Research Program is conducting the world's longest-running study of a wild dolphin population. Since 1970, they've built a database of sounds from over 300 dolphins.
Jensen says bottlenose dolphins have distinct, individual sounds researchers call signature whistles.
"It's the dolphin equivalent of a human name. Dolphins use these signature whistles to maintain social bonds and recognize each other," he said.
Dolphins also make non-signature whistles, which comprise approximately 50 per cent of the whistles they produce, but there's little research in this area. The study published by the winning team suggests that the non-signature whistles could function like words with mutually understood, context-specific meanings.
Jensen says AI can help researchers decode the dolphins' communication by automatically detecting and discovering new shared whistle types.
"We're looking into how to use it for identifying patterns of use across individuals and contexts so that we can begin to infer meaning from how dolphins use these," he said.
Jensen and other researchers say one of AI's strengths is its ability to process large amounts of data.
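The article doesn't spell out the team's actual pipeline, but "automatically discovering shared whistle types" typically means grouping similar frequency contours together. As a rough illustration only, with entirely invented toy data, here is a minimal sketch that clusters synthetic whistle contours with a bare-bones k-means:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)  # time axis for one whistle

def toy_whistle(rising):
    """Invented frequency contour in kHz: an upsweep or a downsweep, plus noise."""
    base = 5 + 10 * t if rising else 15 - 10 * t
    return base + rng.normal(0.0, 0.3, t.size)

# 20 synthetic whistles of each toy type, stacked as rows
X = np.array([toy_whistle(True) for _ in range(20)] +
             [toy_whistle(False) for _ in range(20)])

def kmeans(X, iters=10):
    """Minimal 2-cluster k-means over whistle contours."""
    # seed the centroids with the first and last contour (one of each toy type)
    centroids = X[[0, -1]].copy()
    for _ in range(iters):
        # distance from every contour to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)          # nearest-centroid assignment
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

labels = kmeans(X)
# the two toy whistle types fall into separate clusters
```

Real whistle analysis works on extracted contours from recordings rather than synthetic curves, and published pipelines use far richer representations, but the core idea is the same: contours that trace similar shapes land in the same cluster, and clusters that recur across individuals become candidate shared whistle types.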
Sophie Cohen-Bodénès and her team at Washington University in St. Louis, Mo., who were shortlisted for the prize, are using AI to decipher patterns in cuttlefish arm wave signals, a form of sign language.
Through non-invasive behavioural experiments, Cohen-Bodénès found that the creatures interpret arm signs using vision and vibrations.
"We're in the process of collecting large datasets from many behavioural contexts to give to the AI algorithm that could find, in an objective way, the different correlations between different arm signs," she said.
Cohen-Bodénès says her research goal is to gain more insight into the meaning of animal communication displays and their underlying sensory mechanisms.
"It's a way to better assess their welfare, to better understand their needs and to improve their protection."
WATCH | Dolphins circle space capsule: Marine mammal expert Ashley Noseworthy recounts the moment a pod of dolphins greeted the SpaceX capsule carrying NASA astronauts returning from nine months stuck in space. (Video, 1:04)
Limits of AI
Yovel says AI is a powerful tool but it has shortcomings.
When researchers interpret the meaning and context of their findings, he says, those interpretations rest on human observations, which could be limited or wrong.
Yovel believes humans might be able to communicate with animals, but he's skeptical that AI could power a device or algorithm that would let humans and animals hold a conversation.
Human communication is complicated and allows people to discuss a wide range of topics, and Yovel doubts animal communication systems are as complex.
"We have this language, which is extremely complex, and it seems to stand out in comparison to other animal communication systems," he said.
To understand whether, and how, human language stands out, Yovel says we need a better understanding of the natural world.




