Latest news with #animalcommunication
The Guardian
3 days ago
- General
- The Guardian
We're close to translating animal languages – what happens then?
Charles Darwin suggested that humans learned to speak by mimicking birdsong: our ancestors' first words may have been a kind of interspecies exchange. Perhaps it won't be long before we join the conversation once again.

The race to translate what animals are saying is heating up, with riches as well as a place in history at stake. The Jeremy Coller Foundation has promised $10m to whichever researchers can crack the code. This is a race fuelled by generative AI; large language models can sort through millions of recorded animal vocalisations to find their hidden grammars.

Most projects focus on cetaceans because, like us, they learn through vocal imitation and, also like us, they communicate via complex arrangements of sound that appear to have structure and hierarchy. Sperm whales communicate in codas – rapid sequences of clicks, each as brief as 1,000th of a second. Project Ceti (the Cetacean Translation Initiative) is using AI to analyse codas in order to reveal the mysteries of sperm whale speech. There is evidence the animals take turns, use specific clicks to refer to one another, and even have distinct dialects. Ceti has already isolated a click that may be a form of punctuation, and they hope to speak whaleish as soon as 2026.

The linguistic barrier between species is already looking porous. Last month, Google released DolphinGemma, an AI program to translate dolphins, trained on 40 years of data. In 2013, scientists using an AI algorithm to sort dolphin communication identified a new click in the animals' interactions with one another, which they recognised as a sound they had previously trained the pod to associate with sargassum seaweed – the first recorded instance of a word passing from one species into another's native vocabulary.

The prospect of speaking dolphin or whale is irresistible. And it seems that they are just as enthusiastic. In November last year, scientists in Alaska recorded an acoustic 'conversation' with a humpback whale called Twain, in which they exchanged a call-and-response form known as 'whup/throp' with the animal over a 20-minute period. In Florida, a dolphin named Zeus was found to have learned to mimic the vowel sounds A, E, O and U.

But in the excitement we should not ignore the fact that other species are already bearing eloquent witness to our impact on the natural world. A living planet is a loud one. Healthy coral reefs pop and crackle with life. But soundscapes can decay just as ecosystems can. Degraded reefs are hushed deserts. Since the 1960s, shipping and mining have raised background noise in the oceans by about three decibels a decade. Humpback whale song occupies the same low-frequency bandwidth as deep-sea dredging and drilling for the rare earths that are vital for electronic devices. Ironically, mining the minerals we need to communicate cancels out whales' voices.

Humpback whale songs are incredible vocal performances, sometimes lasting up to 24 hours. 'Song' is apt: they seem to include rhymed phrases, and their compositions travel the oceans with them, evolving as they go in a process called 'song revolutions', where a new cycle replaces the old. (Imagine if Nina Simone or the Beatles had erased their back catalogue with every new release.) They're crucial to migration and breeding seasons. But in today's louder soundscape, whale song is crowded out of its habitual bandwidth and even driven to silence – from up to 1.2 km away from commercial ships, humpback whales will cease singing rather than compete with the noise.
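That 'three decibels a decade' figure compounds quickly, because decibels are logarithmic: every 3 dB roughly doubles acoustic power. A minimal Python sketch, assuming a steady rise over the six decades since the 1960s, makes the cumulative scale concrete:

```python
# Rough arithmetic for "about three decibels a decade" since the 1960s.
# Decibels are logarithmic: power_ratio = 10 ** (dB / 10).
decades = 6                       # ~1960s to the 2020s (assumed)
db_rise = 3 * decades             # 18 dB cumulative increase
power_ratio = 10 ** (db_rise / 10)
print(f"+{db_rise} dB is about {power_ratio:.0f}x the background acoustic power")
# Output: +18 dB is about 63x the background acoustic power
```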
In interspecies translation, sound only takes us so far. Animals communicate via an array of visual, chemical, thermal and mechanical cues, inhabiting worlds of perception very different to ours. Can we really understand what sound means to echolocating animals, for whom sound waves can be translated visually? The German ecologist Jakob von Uexküll called these impenetrable worlds umwelten. To truly translate animal language, we would need to step into that animal's umwelt – and then, what of us would be imprinted on her, or her on us? 'If a lion could talk,' writes Stephen Budiansky, revising Wittgenstein's famous aphorism from Philosophical Investigations, 'we probably could understand him. He just would not be a lion any more.'

We should ask, then, how speaking with other beings might change us. Talking to another species might be very like talking to alien life. It's no coincidence that Ceti echoes Seti, the Search for Extraterrestrial Intelligence Institute. In fact, a Seti team recorded the whup/throp exchange, on the basis that learning to speak with whales may help us if we ever meet intelligent extraterrestrials.

In Denis Villeneuve's movie Arrival, whale-like aliens communicate via a script in which the distinction between past, present and future times collapses. For Louise, the linguist who translates the script, learning Heptapod lifts her mind out of linear time and into a reality in which her own past and future are equally available. The film mentions Edward Sapir and Benjamin Whorf's theory of linguistic determinism – the idea that our experience of reality is encoded in language – to explain this. The Sapir-Whorf hypothesis was dismissed in the mid-20th century, but linguists have since argued that there may be some truth to it. Pormpuraaw speakers in northern Australia refer to time moving from east to west, rather than forwards or backwards as in English, making time indivisible from the relationship between their body and the land.

Whale songs are born from an experience of time that is radically different to ours. Humpbacks can project their voices over miles of open water; their songs span the widest oceans. Imagine the swell of oceanic feeling on which such sounds are borne. Speaking whale would expand our sense of space and time into a planetary song. I imagine we'd think very differently about polluting the ocean soundscape so carelessly.

Where it counts, we are perfectly able to understand what nature has to say; the problem is, we choose not to. As incredible as it would be to have a conversation with another species, we ought to listen better to what they are already telling us.

David Farrier is the author of Nature's Genius: Evolution's Lessons for a Changing Planet (Canongate).

Further reading: Why Animals Talk by Arik Kershenbaum (Viking, £10.99); Philosophical Investigations by Ludwig Wittgenstein (Wiley-Blackwell, £24.95); An Immense World by Ed Yong (Vintage, £12.99).


Gizmodo
17-05-2025
- Science
- Gizmodo
AI Is Deciphering Animal Speech. Should We Try to Talk Back?
Chirps, trills, growls, howls, squawks. Animals converse in all kinds of ways, yet humankind has only scratched the surface of how they communicate with each other and the rest of the living world. Our species has trained some animals—and if you ask cats, animals have trained us, too—but we've yet to truly crack the code on interspecies communication.

Increasingly, animal researchers are deploying artificial intelligence to accelerate our investigations of animal communication—both within species and between branches on the tree of life. As scientists chip away at the complex communication systems of animals, they move closer to understanding what creatures are saying—and maybe even how to talk back. But as we try to bridge the linguistic gap between humans and animals, some experts are raising valid concerns about whether such capabilities are appropriate—or whether we should even attempt to communicate with animals at all.

Using AI to untangle animal language

Towards the front of the pack—or should I say pod?—is Project CETI, which has used machine learning to analyze more than 8,000 sperm whale 'codas'—structured click patterns recorded by the Dominica Sperm Whale Project. Researchers uncovered contextual and combinatorial structures in the whales' clicks, naming features like 'rubato' and 'ornamentation' to describe how whales subtly adjust their vocalizations during conversation. These patterns helped the team create a kind of phonetic alphabet for the animals—an expressive, structured system that may not be language as we know it but reveals a level of complexity that researchers weren't previously aware of. Project CETI is also working on ethical guidelines for the technology, a critical goal given the risks of using AI to 'talk' to the animals.

Meanwhile, Google and the Wild Dolphin Project recently introduced DolphinGemma, a large language model (LLM) trained on 40 years of dolphin vocalizations. Just as ChatGPT is an LLM for human inputs—taking text and images and producing responses to relevant queries—DolphinGemma intakes dolphin sound data and predicts what vocalization comes next. DolphinGemma can even generate dolphin-like audio, and the researchers' prototype two-way system, Cetacean Hearing Augmentation Telemetry (fittingly, CHAT), uses a smartphone-based interface that dolphins employ to request items like scarves or seagrass—potentially laying the groundwork for future interspecies dialogue.

'DolphinGemma is being used in the field this season to improve our real-time sound recognition in the CHAT system,' said Denise Herzing, founder and director of the Wild Dolphin Project, which spearheaded the development of DolphinGemma in collaboration with researchers at Google DeepMind, in an email to Gizmodo. 'This fall we will spend time ingesting known dolphin vocalizations and let Gemma show us any repeatable patterns they find,' such as vocalizations used in courtship and mother-calf discipline. In this way, Herzing added, the AI applications are two-fold: researchers can use it both to explore dolphins' natural sounds and to better understand the animals' responses to human mimicking of dolphin sounds, which are synthetically produced by the AI CHAT system.
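DolphinGemma's internals aren't spelled out here, but the core idea the article describes, treating vocalizations as tokens and predicting the one that comes next, can be shown in miniature. Below is a toy bigram predictor in Python; every token name and sequence is invented for illustration.

```python
from collections import Counter, defaultdict

# Toy illustration of next-vocalization prediction: treat discretized
# sounds as tokens and count which token tends to follow which.
# All token names and sequences below are invented.
sequences = [
    ["whistle_a", "click_burst", "whistle_b", "click_burst"],
    ["whistle_a", "click_burst", "whistle_b", "whistle_b"],
    ["whistle_a", "whistle_b", "click_burst", "whistle_b"],
]

# Count bigram transitions across all recorded sequences.
transitions = defaultdict(Counter)
for seq in sequences:
    for current, nxt in zip(seq, seq[1:]):
        transitions[current][nxt] += 1

def predict_next(token: str) -> str:
    """Return the most frequent follower of `token` in the training data."""
    followers = transitions[token]
    return followers.most_common(1)[0][0] if followers else "<unknown>"

print(predict_next("whistle_a"))  # -> "click_burst"
```

A model like DolphinGemma replaces these raw counts with a transformer over learned audio tokens, but the prediction target is the same kind of thing: the most plausible next vocalization.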
Expanding the animal AI toolkit

Outside the ocean, researchers are finding that human speech models can be repurposed to decode terrestrial animal signals, too. A University of Michigan-led team used Wav2Vec2—a speech recognition model trained on human voices—to identify dogs' emotions, genders, breeds, and even individual identities based on their barks. The pre-trained human model outperformed a version trained solely on dog data, suggesting that human language model architectures could be surprisingly effective in decoding animal communication.
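The Michigan team's exact training setup isn't given here, so the following is only a sketch of the general recipe: reuse a human-speech Wav2Vec2 encoder as an audio classifier via the Hugging Face transformers library. The four bark labels are hypothetical stand-ins, and the classification head is freshly initialized, so its output is arbitrary until fine-tuned on labeled barks.

```python
import torch
from transformers import AutoFeatureExtractor, Wav2Vec2ForSequenceClassification

# Sketch: repurpose a human-speech Wav2Vec2 encoder as a bark classifier.
# The label names are hypothetical stand-ins, not the study's classes.
labels = ["aggressive", "playful", "fearful", "neutral"]
model_name = "facebook/wav2vec2-base"

extractor = AutoFeatureExtractor.from_pretrained(model_name)
model = Wav2Vec2ForSequenceClassification.from_pretrained(
    model_name, num_labels=len(labels)
)

# One second of random audio at 16 kHz stands in for a recorded bark;
# in practice you would fine-tune on labeled bark clips first.
waveform = torch.randn(16000)
inputs = extractor(waveform.numpy(), sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(labels[int(logits.argmax())])  # untrained head: an arbitrary guess
```

The design point the study makes is that the encoder's human-speech pretraining, not the head, is what carries over: the same transfer pattern the rest of this section describes.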
Of course, we need to consider the different levels of sophistication these AI models are targeting. Determining whether a dog's bark is aggressive or playful, or whether the animal is male or female—these are perhaps understandably easier for a model to determine than, say, the nuanced meaning encoded in sperm whale phonetics. Nevertheless, each study inches scientists closer to understanding how AI tools, as they currently exist, can be best applied to such an expansive field—and gives researchers a chance to refine the AI into a more useful part of their toolkit.

And even cats—often seen as aloof—appear to be more communicative than they let on. In a 2022 study out of Paris Nanterre University, cats showed clear signs of recognizing their owner's voice, but beyond that, the felines responded more intensely when spoken to directly in 'cat talk.' That suggests cats not only pay attention to what we say, but also how we say it—especially when it comes from someone they know.

Earlier this month, a pair of cuttlefish researchers found evidence that the animals have a set of four 'waves,' or physical gestures, that they make to one another, as well as to human playback of cuttlefish waves. The group plans to apply an algorithm to categorize the types of waves, automatically track the creatures' movements, and more rapidly understand the contexts in which the animals express themselves.

Private companies (such as Google) are also getting in on the act. Last week, China's largest search engine, Baidu, filed a patent with the country's IP administration proposing to translate animal (specifically cat) vocalizations into human language. The quick and dirty on the tech is that it would intake a trove of data from your kitty, and then use an AI model to analyze the data, determine the animal's emotional state, and output the apparent human language message your pet was trying to convey.

A universal translator for animals?

Together, these studies represent a major shift in how scientists are approaching animal communication. Rather than starting from scratch, research teams are building on tools and models designed for humans—and making advances that would have taken much longer otherwise. The end goal could (read: could) be a kind of Rosetta Stone for the animal kingdom, powered by AI.

'We've gotten really good at analyzing human language just in the last five years, and we're beginning to perfect this practice of transferring models trained on one dataset and applying them to new data,' said Sara Keen, a behavioral ecologist and electrical engineer at the Earth Species Project, in a video call with Gizmodo.

The Earth Species Project plans to launch its flagship audio-language model for animal sounds, NatureLM, this year, and a demo for NatureLM-audio is already live. With input data from across the tree of life—as well as human speech, environmental sounds, and even music detection—the model aims to become a converter of human speech into animal analogues. The model 'shows promising domain transfer from human speech to animal communication,' the project states, 'supporting our hypothesis that shared representations in AI can help decode animal languages.'

'A big part of our work really is trying to change the way people think about our place in the world,' Keen added. 'We're making cool discoveries about animal communication, but ultimately we're finding that other species are just as complicated and nuanced as we are. And that revelation is pretty exciting.'

The ethical dilemma

Indeed, researchers generally agree on the promise of AI-based tools for improving the collection and interpretation of animal communication data. But some feel there's a breakdown in communication between that scholarly familiarity and the public's perception of how these tools can be applied.

'I think there's currently a lot of misunderstanding in the coverage of this topic—that somehow machine learning can create this contextual knowledge out of nothing. That so long as you have thousands of hours of audio recordings, somehow some magic machine learning black box can squeeze meaning out of that,' said Christian Rutz, an expert in animal behavior and cognition and founding president of the International Bio-Logging Society, in a video call with Gizmodo. 'That's not going to happen.'

'Meaning comes through the contextual annotation and this is where I think it's really important for this field as a whole, in this period of excitement and enthusiasm, to not forget that this annotation comes from basic behavioral ecology and natural history expertise,' Rutz added. In other words, let's not put the cart before the horse—especially since the cart, in this case, is what's powering the horse.

But with great power…you know the cliché. Essentially, how can humans develop and apply these technologies in a way that is both scientifically illuminating and minimizes harm or disruption to their animal subjects? Experts have put forward ethical standards and guardrails for using the technologies that prioritize the welfare of creatures as we get closer to—well, wherever the technology is going.

As AI advances, conversations about animal rights will have to evolve. In the future, animals could become more active participants in those conversations—a notion that legal experts are exploring as a thought exercise, but one that could someday become reality.

'What we desperately need—apart from advancing the machine learning side—is to forge these meaningful collaborations between the machine learning experts and the animal behavior researchers,' Rutz said, 'because it's only when you put the two of us together that you stand a chance.'

There's no shortage of communication data to feed into data-hungry AI models, from pitch-perfect prairie dog squeaks to snails' slimy trails (yes, really). But exactly how we make use of the information we glean from these new approaches requires thorough consideration of the ethics involved in 'speaking' with animals.

A recent paper on the ethical concerns of using AI to communicate with whales outlined six major problem areas. These include privacy rights, cultural and emotional harm to whales, anthropomorphism, technological solutionism (an overreliance on technology to fix problems), gender bias, and limited effectiveness for actual whale conservation. That last issue is especially urgent, given how many whale populations are already under serious threat.
It increasingly appears that we're on the brink of learning much more about the ways animals interact with one another—indeed, pulling back the curtain on their communication could also yield insights into how they learn, socialize, and act within their environments. But there are still significant challenges to overcome, not least deciding how we use the powerful technologies currently in development.


South China Morning Post
15-05-2025
- Entertainment
- South China Morning Post
China sound mimicry influencer resembles square-faced monkey, plans to 'talk' to the animal
A sound mimicry influencer in eastern China, who bears a striking resemblance to a square-faced monkey, plans to 'communicate' with the animal using his unique skills. Li Xianbi, 31, from Jiangsu province, has 3.4 million online followers thanks to his remarkable animal sound impressions. A former soldier, Li dedicated over a year to training with Niu Yuliang, a practitioner of kouji, a traditional sound mimicry art recognised as part of China's national intangible cultural heritage. This art form employs the mouth, teeth, lips, tongue, throat, and nose to recreate various sounds found in nature. Li told the mainland media outlet Dawan News that he has been attuned to sounds since childhood and has always been fascinated by animals. He responded with good humour to jokes about his resemblance to the monkey, saying he does not mind them as long as they are lighthearted. He can imitate dozens of animal calls, including those of horses, cows, sheep, chickens, ducks, and pigs, with his dog sounds being the most renowned.
Yahoo
11-05-2025
- Science
- Yahoo
Chinese Tech Giant Wants to Translate Your Cat's Meows Using AI
Chinese tech company Baidu is working on an artificial intelligence-based translation system that could finally decode the greatest language mystery in the world: your cat's meows. As Reuters reports, the company filed a patent with the China National Intellectual Property Administration proposing an AI-powered system to translate animal sounds. But whether it will ultimately succeed in deciphering your dog's barks or your cat's meows remains to be seen. Despite years of research, scientists are still far from deciphering animal communication.

Baidu hopes the system could bring humans and their pets closer together. According to the company's patent document, it could allow for a "deeper emotional communication and understanding between animals and humans, improving the accuracy and efficiency of interspecies communication." A spokesperson told Reuters that the system is "still in the research phase," suggesting there's still significant work to be done. But Baidu has already made considerable headway: the company, which also runs the country's largest search engine, has invested in AI for years and released its latest AI model last month.

Baidu is only one of many organisations working to decode animal communication using AI. The California-based nonprofit Earth Species Project, for instance, has been attempting to build an AI-based system that can translate birdsong, the whistles of dolphins, and the rumblings of elephants; it recently announced $17 million in grants to support NatureLM, its language model for identifying the ways animals communicate with each other. Researchers have also attempted to use machine learning to understand the vocalizations of crows and monkeys.

While a direct animal translation tool is more than likely still many years out, some scientists have claimed early successes. Last year, a team of scientists from the SETI (Search for Extraterrestrial Intelligence) Institute claimed to have "conversed" with a humpback whale in Alaska. "The things we learn from communicating with whales could help us when it comes time to connect with aliens," SETI researcher and University of California, Davis animal behaviorist Josie Hubbard told the New York Post at the time.
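The patent's actual implementation is not public, so the following is purely a hypothetical sketch of the staged pipeline the reporting describes: take in signals from the pet, infer an emotional state, and emit a human-language message. Every feature, threshold, and message below is invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical pipeline mirroring the stages described in the patent
# reporting: signal features in, emotional state inferred, message out.
# All features, thresholds, and messages are invented.

@dataclass
class MeowFeatures:
    pitch_hz: float      # fundamental frequency of the meow
    duration_s: float    # how long the vocalization lasts
    loudness_db: float   # relative intensity

def infer_emotion(f: MeowFeatures) -> str:
    """Toy rule-based stand-in for the patent's AI analysis stage."""
    if f.loudness_db > 70 and f.pitch_hz > 600:
        return "distressed"
    if f.duration_s > 1.0:
        return "demanding"
    return "content"

MESSAGES = {
    "distressed": "Something is wrong, please check on me.",
    "demanding": "Feed me. Now.",
    "content": "All is well.",
}

meow = MeowFeatures(pitch_hz=650.0, duration_s=0.4, loudness_db=75.0)
print(MESSAGES[infer_emotion(meow)])  # -> "Something is wrong, please check on me."
```

A production system would presumably replace the rule-based stage with a trained classifier over learned audio features; the staged structure, analysis then emotional state then message, is the part the patent reporting actually describes.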