
Latest news with #WildDolphinProject

Google made an AI model to talk to dolphins

Yahoo

14-04-2025



A new large language model (LLM) may soon allow humans to converse with dolphins. Scheduled to debut in the coming months, DolphinGemma and its companion Cetacean Hearing Augmentation Telemetry (CHAT) system will be tested to see whether they can translate and mimic some of the mammals' complex vocalizations. If successful, the breakthrough would represent the culmination of over four decades of work, documentation, and conservation efforts.

Dolphins are among the Earth's smartest and most communicative animals. Their social interactions are so complex that researchers at the Wild Dolphin Project (WDP) have spent the last 40 years attempting to decipher them. WDP has amassed decades' worth of underwater audio and video documenting a single community of Atlantic spotted dolphins in the Bahamas, and along the way has been able to correlate sounds with behaviors such as courtship, unique names, and dolphin squabbles. Experts have long theorized that humans might communicate with the cetaceans, but lacked technology advanced enough to parse and mimic the species' underwater whistles, clicks, and burst pulses.

With the rise of large language models, researchers wondered whether the same principles could be applied to dolphin interactions. To test this possibility, WDP partnered with Google and the Georgia Institute of Technology, supplying engineers with a massive, labeled dataset of dolphin whistles, clicks, and burst pulses for use in training. The result is DolphinGemma, an AI model built on the same technology that runs Google's Gemini systems. With roughly 400 million parameters, DolphinGemma functions in essentially the same way as predictive LLMs like ChatGPT, but for dolphins: it receives and interprets audio inputs, then predicts the sounds likely to follow so they can be recreated.
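The predictive loop described above (audio in, likely next sound out) can be illustrated with a toy sketch. Everything here is an assumption for illustration: the real DolphinGemma is a roughly 400-million-parameter model trained on tokenized audio, not a bigram table, and the token names below are invented.

```python
from collections import Counter, defaultdict

def train_bigram_model(sequences):
    """Count how often each sound token follows another."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most likely next token after `token`, or None if unseen."""
    following = counts.get(token)
    if not following:
        return None
    return following.most_common(1)[0][0]

# Toy corpus: each sequence is a hypothetical tokenized vocalization
# ("W" = whistle, "C" = click, "B" = burst pulse).
corpus = [["W", "C", "C", "B"], ["W", "C", "B"], ["C", "B", "W"]]
model = train_bigram_model(corpus)
print(predict_next(model, "W"))  # "C" follows "W" most often in this corpus
```

The point of the sketch is only the shape of the task: learn statistics of which sounds follow which, then use them to predict a plausible continuation.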
DolphinGemma is paired with the CHAT system, which runs on modified Google Pixel smartphones. CHAT isn't designed to fully translate a dolphin's natural language, but to help humans establish a simpler, shared vocabulary. The plan is to teach members of the WDP's Atlantic spotted dolphin community a series of synthetic whistles associated with their favorite objects, such as seagrass, sargassum, and even researchers' scarves. Over time, experts hope the dolphins will learn to request desired items when they want to play.

There's still a lot of work to be done before humans and dolphins bridge the interspecies communication gap. But with this creative use of LLMs, those underwater conversations are another step closer.
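The shared-vocabulary idea behind CHAT reduces, at its simplest, to a lookup from a recognized synthetic whistle to the object it names. The whistle IDs and object names below are hypothetical; the real system matches acoustic signals, not string labels.

```python
# Hypothetical mapping from synthetic whistle IDs to the objects
# they were taught to stand for (invented labels for illustration).
VOCAB = {
    "whistle_A": "scarf",
    "whistle_B": "sargassum",
    "whistle_C": "seagrass",
}

def handle_detected_whistle(whistle_id):
    """Map a recognized synthetic whistle to the object it names."""
    obj = VOCAB.get(whistle_id)
    if obj is None:
        return "unknown whistle"
    return f"dolphin requested: {obj}"

print(handle_detected_whistle("whistle_A"))  # dolphin requested: scarf
```

In practice the hard part is the acoustic recognition step that produces the whistle ID in the first place; the vocabulary lookup itself is deliberately simple so researchers and dolphins can converge on it.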

Humans will soon understand dolphins, claim experts

Telegraph

14-04-2025



The launch of a new artificial intelligence model has brought humans closer to understanding dolphins, experts claim. Google DeepMind's DolphinGemma is programmed with the world's largest collection of vocalisations from Atlantic spotted dolphins, recorded over several years by the Wild Dolphin Project. It is hoped the recently launched large language model will be able to pick out hidden patterns, potential meanings and even language from the animals' clicks and whistles.

Dr Denise Herzing, the founder and research director of the Wild Dolphin Project, said: 'We do not know if animals have words. Dolphins can recognise themselves in the mirror, they use tools, so they're smart – but language is still the last barrier.

'So feeding dolphin sounds into an AI model will give us a really good look at if there are patterns, subtleties that humans can't pick out. You're going to understand what priorities they have, what they are talking about.

'The goal would someday be to "speak dolphin", and we're really trying to crack the code. I've been waiting for this for 40 years.'

Dolphins have complex communication: from birth they squawk, click, and squeak to each other, and even use unique whistles to address individuals by name. Mothers often use specific noises to call their calves back, while fighting dolphins emit burst pulses, and those courting, or chasing sharks, make buzzing sounds.

For decades, researchers have been trying to decode the chatter, but monitoring pods over vast distances has proven too difficult for humans to detect patterns. The new AI is programmed to search through thousands of sounds that have been linked to behaviour to try to find sequences that could indicate words or language.
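Searching sounds linked to behaviour for recurring sequences is, in its most basic form, frequency mining over tokenized vocalizations. The sketch below is a stand-in assumption, not the actual method: it counts n-grams that recur across sequences, where a deep model would find far subtler structure. The token names are invented.

```python
from collections import Counter

def frequent_ngrams(sequences, n=3, min_count=2):
    """Find sound n-grams that recur across vocalization sequences."""
    counts = Counter()
    for seq in sequences:
        for i in range(len(seq) - n + 1):
            counts[tuple(seq[i:i + n])] += 1
    return [(gram, c) for gram, c in counts.most_common() if c >= min_count]

# Toy data: token sequences recorded during the same behaviour
# (invented labels for illustration).
courtship = [["buzz", "W1", "W2"], ["buzz", "W1", "W2", "click"]]
print(frequent_ngrams(courtship, n=3))  # [(('buzz', 'W1', 'W2'), 2)]
```

A sequence that keeps recurring in the same behavioural context is exactly the kind of candidate 'word' the researchers hope the model will surface.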
Dr Thad Starner, a Google DeepMind research scientist, said: 'By identifying recurring sound patterns, clusters and reliable sequences, the model can help researchers uncover hidden structures and potential meanings within the dolphins' natural communication – a task previously requiring immense human effort.

'We're not just listening any more. We're beginning to understand the patterns within the sounds, paving the way for a future where the gap between human and dolphin communication might just get a little smaller. We can keep on fine-tuning the model as we go and hopefully get a better and better understanding of what dolphins are producing.'

Talking 'in dolphin'

The team hopes that eventually it will be able to synthesise the sounds in order to talk back 'in dolphin', or to develop a new shared vocabulary. It invented a device that can play whistles in the water so that dolphins can learn to associate each noise with a certain object.

Describing the technique, Dr Starner added: 'Two researchers get into the water with a group of dolphins. Researcher A might have a scarf – a toy that the dolphins want to play with – and Researcher B is going to ask for that scarf. So Researcher B can play a whistle and Researcher A will hand Researcher B that scarf.

'They might pass the scarf back and forth a couple of times, playing that whistle over and over, and the hope is the dolphins who are watching all of this will figure out the social content and can repeat that sound to ask for the scarf. If that happens, then dolphins have mimicked one word in our tiny made-up dolphin language.'

Researchers will be able to input their own data into DolphinGemma, released as open source on Monday, to try to accelerate advancements in the field. Separately, the University of La Laguna in Spain announced this week that it had developed a new AI system for classifying the vocalisations of orcas in real time.
The research project, funded by the Loro Parque Foundation, used more than 75,000 orca sounds, recorded and classified over nearly two decades at Loro Parque, to develop a neural network.
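Real-time call classification of the kind described above assigns each incoming sound to a known call type. The article gives no implementation detail, so the sketch below is purely illustrative: the La Laguna system uses a neural network trained on over 75,000 recordings, whereas here a nearest-centroid lookup over made-up two-dimensional feature vectors stands in for it, with invented call-type labels.

```python
import math

# Hypothetical per-call-type average feature vectors
# (invented values and labels for illustration).
CENTROIDS = {
    "call_type_1": [0.8, 0.1],
    "call_type_2": [0.2, 0.9],
}

def classify(features):
    """Assign a feature vector to the nearest call-type centroid."""
    return min(CENTROIDS, key=lambda k: math.dist(CENTROIDS[k], features))

print(classify([0.75, 0.2]))  # nearest to call_type_1's centroid
```

The real-time aspect then amounts to running such a classifier over a sliding window of the incoming audio stream as features arrive.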
