
Clicks & growls: Why AI's hearing the call of the wild

Mint

29-04-2025



Google has used artificial intelligence (AI) to decode and mimic dolphin sounds, advancing our understanding of marine life. But can AI truly outperform human insight in interpreting animal communication?

Dolphins are famously social, intelligent, agile and playful, sharing many emotional traits with humans. Just as British ethologist Jane Goodall studied the social and family interactions of wild chimpanzees, Denise Herzing has studied dolphin communication in the wild since 1985, making The Wild Dolphin Project (WDP) the longest-running underwater dolphin research project in the world. Google, in partnership with Georgia Tech and WDP, used an AI model to analyse vocal patterns much as a language model analyses text, identifying structure and predicting sequences (a toy sketch of this idea appears at the end of this article).

Dolphins use distinct sounds in different social situations: whistles act like names for individual identification, especially between mothers and calves; squawks often occur during conflicts; and click buzzes are used in courtship. DolphinGemma, Google's 400-million-parameter model that runs on Pixel 6 smartphones, decodes these sounds by combining audio technology with data WDP has gathered from wild dolphins in the Bahamas. On National Dolphin Day (14 April), Google showcased advances to the model, which can now analyse dolphin vocal patterns and generate realistic, dolphin-like sounds.

AI is also being used to study how parrots, crows, wolves, whales, chimpanzees and octopuses communicate. NatureLM-audio, the first audio-language model built for animal sounds, can generalize to species it has never heard. Other projects use AI and robotics to decode sperm whale clicks, or listen to elephants for possible warning calls and mating signals.

Such work aids conservation by monitoring endangered species. Decoding communication reveals ecosystem health, alerting us to pollution and climate change, and it can enhance human-animal interactions and foster empathy. In Guacamaya, a collaboration between Universidad de los Andes, Instituto SINCHI, Instituto Humboldt, Planet Labs and Microsoft's AI for Good Lab, AI combined with satellite imagery, camera traps and bioacoustics is being used to monitor deforestation and protect the Amazon.

AI can detect patterns in animal sounds, but without context (is the animal mating, feeding or in danger?) its interpretations are limited. The risk of humans assuming animals 'talk' the way humans do, messy field data and species-specific behaviours all complicate analysis. AI might identify correlations but not true meaning or intent; human intuition still helps. These systems often require custom models and extensive resources, making large-scale, accurate decoding of animal communication a complex effort.
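To make the "much like a language model" idea concrete, here is a minimal, hypothetical Python sketch of sequence prediction over sound events. It is not Google's model and uses no real WDP data; the token names and sequences are invented for illustration. The point is the shared objective: learn which unit tends to follow which, then predict the next one.

```python
from collections import Counter, defaultdict

# Invented example sequences of dolphin sound events, treated as
# discrete tokens (hypothetical labels, not real field recordings).
sequences = [
    ["whistle", "whistle", "click_buzz", "squawk"],
    ["whistle", "click_buzz", "click_buzz", "whistle"],
    ["squawk", "squawk", "whistle", "click_buzz"],
]

# Count bigrams: how often each token follows each preceding token.
transitions = defaultdict(Counter)
for seq in sequences:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent follower of `token`, if one was seen."""
    followers = transitions.get(token)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("whistle"))  # 'click_buzz' given the toy data above
```

Production systems such as DolphinGemma work on raw audio with learned audio tokenizers and far larger neural networks, but the underlying objective of predicting what comes next in a sequence is the same.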
