
Mint Primer | Clicks & growls: Why AI’s hearing the call of the wild

Google has used artificial intelligence (AI) to decode and mimic dolphin sounds, advancing our understanding of marine life. But can AI truly outperform human insight in interpreting animal communication?


What’s the dolphin AI model about?

Dolphins are famously social, intelligent, agile and playful, sharing many emotional traits with (some) humans. Just as British ethologist Jane Goodall studied the social and family lives of wild chimpanzees, marine biologist Denise Herzing has studied dolphin communication in the wild since 1985, making the Wild Dolphin Project (WDP) the longest-running underwater dolphin research study in the world. Google, in partnership with Georgia Tech and WDP, used an AI model to analyse dolphin vocal patterns much the way a language model analyses text: identifying structure and predicting what comes next in a sequence.
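To picture the language-model analogy, imagine each whistle, squawk or click burst mapped to a discrete token; a model can then learn to predict the next token in a sequence, much as text models predict the next word. Below is a toy sketch in Python; the vocabulary and sequences are invented for illustration and are not from the project:

```python
from collections import Counter, defaultdict

# Toy vocabulary of dolphin "sound tokens". Real systems would derive
# tokens from audio (e.g. by clustering spectrogram segments); these
# sequences are invented for illustration.
sequences = [
    ["whistle_A", "click_buzz", "whistle_A", "squawk"],
    ["whistle_B", "whistle_A", "click_buzz", "click_buzz"],
    ["whistle_A", "click_buzz", "whistle_B", "squawk"],
]

# Count bigram transitions: a crude stand-in for the sequence
# statistics a large model learns at scale.
transitions = defaultdict(Counter)
for seq in sequences:
    for cur, nxt in zip(seq, seq[1:]):
        transitions[cur][nxt] += 1

def predict_next(token):
    """Return the sound token most often observed after `token`."""
    counts = transitions[token]
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(predict_next("whistle_A"))  # -> "click_buzz" on this toy data
```

A model like DolphinGemma learns far richer statistics directly from audio, but the underlying idea of sequence prediction is the same.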


How does the model work?

Dolphins use distinct sounds in different social situations: signature whistles act like names for individual identification, especially between mothers and calves; squawks often occur during conflicts; and click buzzes are used in courtship. DolphinGemma, Google’s roughly 400-million-parameter model that is small enough to run on a Pixel 6 smartphone, decodes these sounds by combining Google’s audio technology with data WDP has gathered from wild Atlantic spotted dolphins in the Bahamas. On National Dolphin Day (14 April), Google showcased advances to the model, which can now analyse dolphin vocal patterns and generate realistic, dolphin-like sounds.
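Before any learned model sees the audio, recordings are typically converted into frequency-domain features such as spectrograms, in which whistles show up as narrow tonal bands and clicks as broadband bursts. The heuristic below separates the two with a spectral-flatness measure; it is a simplified stand-in for the learned representations a model like DolphinGemma would use, not the project’s actual pipeline:

```python
import numpy as np
from scipy.signal import spectrogram

def classify_clip(audio: np.ndarray, fs: int = 96_000) -> str:
    """Rough heuristic: tonal whistles concentrate energy in a narrow
    frequency band, while click trains are broadband. A toy stand-in
    for learned audio features (assumption, not the real method)."""
    f, t, Sxx = spectrogram(audio, fs=fs, nperseg=1024)
    spectrum = Sxx.mean(axis=1)          # average power over time
    spectrum /= spectrum.sum() + 1e-12   # normalise to a distribution
    # Spectral flatness: geometric mean / arithmetic mean.
    # Near 1 for noise-like (flat) spectra, near 0 for tonal ones.
    flatness = np.exp(np.mean(np.log(spectrum + 1e-12))) / (spectrum.mean() + 1e-12)
    return "click-like (broadband)" if flatness > 0.5 else "whistle-like (tonal)"

# Synthetic examples: a pure 10 kHz tone vs white noise.
fs = 96_000
tone = np.sin(2 * np.pi * 10_000 * np.arange(fs) / fs)
noise = np.random.randn(fs)
print(classify_clip(tone, fs))   # -> whistle-like (tonal)
print(classify_clip(noise, fs))  # -> click-like (broadband)
```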


Are there other similar projects that use AI?

AI is being used to study how parrots, crows, wolves, whales, chimpanzees and octopuses communicate. NatureLM-audio, the first audio-language model built for animal sounds, can generalize to species it was not trained on. Other projects use AI and robotics to decode sperm whale clicks, or listen to elephants for possible warning calls and mating signals.

What’s the point of this AI usage?

It aids conservation by monitoring endangered species. Decoding communication can reveal ecosystem health, alerting us to the effects of pollution and climate change, and it enhances human-animal interaction and fosters empathy. Project Guacamaya, a collaboration between Universidad de los Andes, Instituto SINCHI, Instituto Humboldt, Planet Labs and Microsoft’s AI for Good Lab, combines AI with satellite imagery, camera traps and bioacoustics to monitor deforestation and protect the Amazon.
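Much of this monitoring starts with something far simpler than a language model: automatically flagging interesting moments in weeks of continuous recordings. The sketch below marks windows that are markedly louder than the recording’s baseline; real conservation systems use trained detectors, so treat this purely as an illustration:

```python
import numpy as np

def detect_events(audio: np.ndarray, fs: int, win_s: float = 1.0,
                  threshold_db: float = 10.0) -> list[float]:
    """Flag start times (seconds) of windows whose energy exceeds the
    recording's median by `threshold_db` dB: a crude stand-in for the
    trained acoustic detectors conservation projects actually deploy."""
    win = int(win_s * fs)
    n_windows = len(audio) // win
    frames = audio[: n_windows * win].reshape(n_windows, win)
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    baseline = np.median(energy_db)
    return [i * win_s for i, e in enumerate(energy_db) if e > baseline + threshold_db]

# Synthetic test: quiet background with a loud one-second burst at t = 5 s.
fs = 16_000
audio = 0.01 * np.random.randn(10 * fs)
audio[5 * fs : 6 * fs] += np.sin(2 * np.pi * 440 * np.arange(fs) / fs)
print(detect_events(audio, fs))  # -> [5.0]
```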

Are there any limits to these AI models?

AI can detect patterns in animal sounds, but without context (is the animal mating, feeding or in danger?) its interpretations are limited. The risk of assuming animals “talk” the way humans do, messy field data and species-specific behaviours all complicate analysis. AI may surface correlations but not true meaning or intent, which is where human intuition still helps. These systems also tend to require custom models and extensive resources, making large-scale, accurate decoding of animal communication a complex effort.
