
Latest news with #WDP

Clicks & growls: Why AI's hearing the call of the wild

Mint

29-04-2025

  • Science
  • Mint


Google has used artificial intelligence (AI) to decode and mimic dolphin sounds, advancing our understanding of marine life. But can AI truly outperform human insight in interpreting animal communication?

Dolphins are famously social, intelligent, agile and playful, sharing many emotional traits with (some) humans. Just as British ethologist Jane Goodall studied the social and family interactions of wild chimpanzees, Denise Herzing has studied dolphin communication in the wild since 1985, making the Wild Dolphin Project (WDP) the longest-running underwater dolphin research project in the world. Google, in partnership with Georgia Tech and WDP, used an AI model to analyse vocal patterns much like a language model does, identifying structure and predicting sequences.

Dolphins use distinct sounds in different social situations: whistles act like names for individual identification, especially between mothers and calves; squawks often occur during conflicts; and click buzzes are used in courtship. DolphinGemma, Google's 400-million-parameter model that runs on Pixel 6 smartphones, decodes these sounds by combining audio technology with data WDP acquired by studying wild dolphins in the Bahamas. On National Dolphin Day (14 April), Google showcased advances to the model, which can now analyse dolphin vocal patterns and generate realistic, dolphin-like sounds.

AI is also being used to study how parrots, crows, wolves, whales, chimpanzees and octopuses communicate. NatureLM-audio is the first audio-language model built for animal sounds and can generalize to unseen species. Other projects use AI and robotics to decode sperm whale clicks, or listen to elephants for possible warning calls and mating signals.
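The "much like a language model" idea can be made concrete with a toy sketch. This is not Google's model: DolphinGemma works at vastly larger scale on raw audio, while the snippet below just learns bigram statistics over hypothetical, already-categorized sound units (the whistle/squawk/buzz labels are illustrative) to show the simplest form of next-sound prediction.

```python
from collections import Counter, defaultdict

def train_bigrams(sequences):
    """Count which sound unit tends to follow which."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most frequent next sound unit after `token`, or None."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

# Hypothetical labeled recordings, each a sequence of sound units.
recordings = [
    ["whistle", "whistle", "buzz"],
    ["whistle", "buzz", "buzz"],
    ["squawk", "squawk", "whistle"],
]
model = train_bigrams(recordings)
print(predict_next(model, "whistle"))  # "buzz" (follows "whistle" twice vs once)
```

A real model replaces these counts with learned representations of the audio itself, but the task is the same: given what was just heard, predict what comes next.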
AI aids conservation by monitoring endangered species. Decoding communication reveals ecosystem health, alerting us to pollution and climate change, and it can enhance human-animal interactions and foster empathy. In the Guacamaya project, a collaboration between Universidad de los Andes, Instituto SINCHI, Instituto Humboldt, Planet Labs and Microsoft's AI for Good Lab, AI combined with satellite imagery, camera traps and bioacoustics is being used to monitor deforestation and protect the Amazon.

But AI can detect animal sound patterns without the context (is the animal mating, feeding or in danger?) that gives them meaning, so its interpretations are limited. The risk of humans assuming animals "talk" the way we do, messy field data and species-specific behaviours all complicate analysis. AI might identify correlations but not true meaning or intent; human intuition still helps. These systems often require custom models and extensive resources, making large-scale, accurate decoding of animal communication a complex effort.

Google working to decode dolphin communication using AI

Fox News

27-04-2025

  • Science
  • Fox News


Cracking the dolphin code. Dolphins are among the smartest animals on Earth and have been revered for thousands of years for their intelligence, emotions and social interaction with humans. Now Google is using artificial intelligence (AI) to try to understand how they communicate with one another, with the hope that one day humans could use the technology to chat with the friendly finned mammals.

Google has teamed up with researchers at the Georgia Institute of Technology and the Wild Dolphin Project (WDP), a Florida-based non-profit that has been studying and recording dolphin sounds for 40 years, to build the new AI model, called DolphinGemma.

For decades, WDP has correlated sound types with behavioral contexts. For instance, signature whistles are used by mothers and calves to reunite, while burst-pulse "squawks" are often observed during dolphin fights, researchers said, according to a Google blog on the project. Click "buzzes" are often used during courtship or while chasing sharks.

Now, using the vast data gathered by WDP, Google has built DolphinGemma on top of its own lightweight open model, known as Gemma. DolphinGemma has been trained to analyze the vast library of recordings to detect patterns, structures and even potential "meanings" behind the dolphin vocalizations. Over time, DolphinGemma will try to organize the dolphin sounds into categories, almost like words, sentences or expressions in human language.

"By identifying recurring sound patterns, clusters and reliable sequences, the model can help researchers uncover hidden structures and potential meanings within the dolphins' natural communication — a task previously requiring immense human effort," a Google post about the project reads. "Eventually, these patterns, augmented with synthetic sounds created by the researchers to refer to objects with which the dolphins like to play, may establish a shared vocabulary with the dolphins for interactive communication."

DolphinGemma uses Google's Pixel phone technology, specifically the audio recording technology used in Pixel devices, to make clean, high-quality recordings of dolphin vocalizations. The Pixel technology can separate dolphin clicks and whistles from background noise like waves, boat engines or underwater static. That clean audio is critical for AI models like DolphinGemma, because messy, noisy data would confuse the AI, researchers said.

Google says it plans to release DolphinGemma as an open model this summer, allowing researchers around the world to use and adapt it. Although it is trained on Atlantic spotted dolphins, the model could also help study other species, like bottlenose or spinner dolphins, with some fine-tuning, researchers said. "By providing tools like DolphinGemma, we hope to give researchers worldwide the tools to mine their own acoustic datasets, accelerate the search for patterns and collectively deepen our understanding of these intelligent marine mammals," the blog post reads.
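The noise-separation step lends itself to a small illustration. Google has not published the Pixel pipeline's internals, so the following is only a minimal sketch of one classic approach: subtracting a moving-average baseline acts as a crude high-pass filter, suppressing slow-varying background (wave or engine rumble) while preserving sharp, click-like transients.

```python
def high_pass(samples, window=5):
    """Crude high-pass filter: subtract a local moving-average baseline."""
    half = window // 2
    out = []
    for i, s in enumerate(samples):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        baseline = sum(samples[lo:hi]) / (hi - lo)  # slow-varying component
        out.append(s - baseline)                    # keep fast transients
    return out

# A steady rumble with one sharp click-like spike on top (toy data).
signal = [10.0, 10.0, 10.0, 10.0, 30.0, 10.0, 10.0, 10.0, 10.0]
filtered = high_pass(signal)
peak = max(range(len(filtered)), key=lambda i: filtered[i])
print(peak)  # 4: the click stands out far above the flattened background
```

Production pipelines would use proper designed filters (and spectral methods) rather than a moving average, but the goal is the same: hand the model audio in which dolphin sounds dominate the background.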

Google's newest AI model is designed to help study dolphin 'speech'

Yahoo

15-04-2025

  • Science
  • Yahoo


Google's AI research lab, Google DeepMind, says it has created an AI model that can help decipher dolphin vocalizations, supporting research efforts to better understand how dolphins communicate. The model, called DolphinGemma, was trained using data from the Wild Dolphin Project (WDP), a nonprofit that studies Atlantic spotted dolphins and their behaviors. Built on Google's open Gemma series of models, DolphinGemma, which can generate "dolphin-like" sound sequences, is efficient enough to run on phones, Google says. This summer, WDP plans to use Google's Pixel 9 smartphone to power a platform that can create synthetic dolphin vocalizations and listen to dolphin sounds for a matching "reply." WDP was previously using the Pixel 6 for this work; upgrading to the Pixel 9 will let the organization's researchers run AI models and template-matching algorithms at the same time, according to Google.
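The phrase "template-matching algorithms" can be made concrete. WDP's actual matcher is not public; a standard baseline is normalized cross-correlation (cosine similarity) between an incoming sound snippet and each stored synthetic-whistle template, with a threshold to reject non-matches. The templates and sample values below are purely illustrative.

```python
import math

def norm_corr(a, b):
    """Cosine similarity between two equal-length sample windows."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def best_match(snippet, templates, threshold=0.9):
    """Return the name of the best-matching template, or None if all score low."""
    scores = {name: norm_corr(snippet, t) for name, t in templates.items()}
    name = max(scores, key=scores.get)
    return name if scores[name] >= threshold else None

# Hypothetical whistle templates for two play objects.
templates = {
    "sargassum": [0.0, 1.0, 0.0, -1.0, 0.0],
    "scarf":     [1.0, 0.5, 0.0, 0.5, 1.0],
}
heard = [0.0, 0.9, 0.1, -0.8, 0.0]  # noisy rendition of the first whistle
print(best_match(heard, templates))  # "sargassum"
```

Real systems match in the spectral domain and must handle time shifts and pitch variation, which is part of why running the matcher alongside an AI model on one phone is worth the Pixel 9 upgrade.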

Google made an AI model to talk to dolphins

Yahoo

14-04-2025

  • Science
  • Yahoo


A new large language model (LLM) AI system may soon allow humans to converse with dolphins. In the coming months, researchers will test whether DolphinGemma and its companion Cetacean Hearing Augmentation Telemetry (CHAT) system can translate and mimic some of the mammals' own complex vocalizations. If successful, the breakthrough would represent the culmination of over four decades' worth of work, documentation and conservation efforts.

Dolphins are some of Earth's smartest and most communicative animals. Their social interactions are so complex that researchers at the Wild Dolphin Project (WDP) have spent the last 40 years attempting to decipher them. In the process, WDP has amassed decades' worth of underwater audio and video documenting a single community of Atlantic spotted dolphins in the Bahamas, and has been able to correlate sounds with behaviors such as courtship, unique names and dolphin squabbles. Experts have long theorized that it may be possible for humans to communicate with the cetaceans, but they lacked technology advanced enough to parse and mimic the species' underwater whistles, clicks and burst pulses. With the rise of LLMs, researchers recently wondered whether the same principles underlying those models could be applied to dolphin interactions.

To test this possibility, WDP partnered with Google and the Georgia Institute of Technology, supplying engineers with a massive, labeled dataset of dolphin whistles, clicks and burst pulses for use in LLM training. The result is DolphinGemma, an AI model built using the same technology that runs Google's Gemini systems. With roughly 400 million parameters, DolphinGemma functions in essentially the same way as predictive LLMs like ChatGPT, but for dolphins: it receives and interprets audio inputs, then predicts likely subsequent sounds for recreation.

DolphinGemma is paired with the CHAT system installed on modified Google Pixel smartphones. CHAT isn't designed to fully translate a dolphin's natural language, but to help humans convey and establish a simpler, shared vocabulary. The plan is to teach members of the WDP's Atlantic spotted dolphin community a series of synthetic whistles tied to their favorite objects, such as seagrass, sargassum and even researchers' scarves. Over time, experts hope the dolphins will learn to request desired items when they want to play. There is still a lot of work to be done before humans and dolphins bridge the interspecies communication gap, but with this creative use of LLMs, those underwater conversations are another step closer.
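The shared-vocabulary loop described above can be sketched in a few lines. The whistle IDs and objects below are hypothetical; the point is only the final step, mapping a detected whistle mimic back to the object the dolphin is requesting.

```python
# Hypothetical shared vocabulary: synthetic whistle ID -> play object.
VOCAB = {
    "whistle_A": "sargassum",
    "whistle_B": "scarf",
    "whistle_C": "seagrass",
}

def handle_detection(whistle_id):
    """Translate a detected whistle mimic into a researcher-readable request."""
    obj = VOCAB.get(whistle_id)
    if obj is None:
        return "unknown whistle: log for later analysis"
    return f"dolphin requested: {obj}"

print(handle_detection("whistle_B"))  # dolphin requested: scarf
```

The hard part, of course, is everything upstream of this lookup: reliably detecting that a dolphin produced a recognizable mimic of one of the synthetic whistles amid open-ocean noise.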
