Researchers at Project CETI, Google DeepMind, and university labs are deploying machine learning to analyze structured sperm whale codas, train LLMs on dolphin vocal data, and repurpose speech-recognition networks for dog barks, pioneering methods for interpreting and responding to animal communication across species.
Key points
- Project CETI uses ML to analyze 8,000+ sperm whale codas, identifying phonetic-like features such as “rubato” and “ornamentation.”
- Google DeepMind’s DolphinGemma, an LLM trained on 40 years of dolphin vocalizations, predicts upcoming clicks and generates synthetic dolphin audio for two-way CHAT interactions.
- University of Michigan researchers repurposed the speech model Wav2Vec2 to classify dog barks by emotion, gender, breed, and individual dog, demonstrating the efficacy of cross-domain transfer learning.
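To make the next-token objective behind DolphinGemma's click prediction concrete, here is a deliberately tiny analogue: a bigram counter over symbolic coda tokens. The real system is a large audio-native transformer trained on raw vocalizations; the token names below (e.g. "1+3", "4R", "5R") are invented for illustration only.

```python
from collections import Counter, defaultdict

def train_bigram(sequences):
    """Count token -> next-token transitions across coda sequences."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            transitions[cur][nxt] += 1
    return transitions

def predict_next(transitions, token):
    """Return the most frequent follower of `token`, or None if unseen."""
    followers = transitions.get(token)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical tokenized codas (labels invented for this sketch).
codas = [
    ["1+3", "1+3", "4R", "5R"],
    ["1+3", "4R", "4R", "5R"],
    ["1+3", "4R", "5R", "5R"],
]
model = train_bigram(codas)
print(predict_next(model, "4R"))  # → "5R", the most common follower
```

The same idea scales up: an LLM replaces the count table with a learned distribution over the next token, which also lets it generate novel sequences (the synthetic audio side of the CHAT work).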
Why it matters: Decoding animal communication with AI could revolutionize ethology by enabling direct interspecies dialogues and deepening our understanding of animal cognition.
Q&A
- What are "codas" in whale communication?
- How does an LLM process dolphin sounds?
- What is transfer learning in animal AI?
- What ethical concerns arise in AI-animal communication?
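On the transfer-learning question: the core idea is to keep a pretrained encoder frozen and fit only a small task-specific head on the new domain's labels. The sketch below is a schematic stand-in, not the actual Wav2Vec2 pipeline: a hand-written "frozen" feature extractor summarizes a waveform, and a nearest-centroid head is fit on labeled barks. All waveforms and labels are invented for illustration.

```python
from collections import defaultdict

def extract_features(waveform):
    """Stand-in for a frozen pretrained encoder: crude summary statistics."""
    n = len(waveform)
    energy = sum(x * x for x in waveform) / n
    # Zero-crossing rate, a rough proxy for pitch content.
    zcr = sum(1 for a, b in zip(waveform, waveform[1:]) if a * b < 0) / (n - 1)
    peak = max(abs(x) for x in waveform)
    return (energy, zcr, peak)

def fit_head(labeled_waveforms):
    """Fit one centroid per label over frozen features (the 'new head')."""
    sums = defaultdict(lambda: [0.0, 0.0, 0.0])
    counts = defaultdict(int)
    for waveform, label in labeled_waveforms:
        for i, v in enumerate(extract_features(waveform)):
            sums[label][i] += v
        counts[label] += 1
    return {lab: tuple(v / counts[lab] for v in sums[lab]) for lab in sums}

def classify(head, waveform):
    """Assign the label whose centroid is nearest in feature space."""
    f = extract_features(waveform)
    return min(head, key=lambda lab: sum((a - b) ** 2
                                         for a, b in zip(f, head[lab])))

# Invented toy data: loud, rapidly alternating "aggressive" barks versus
# quiet, slowly varying "playful" barks.
train = [
    ([5, -5] * 50, "aggressive"),
    ([6, -6] * 50, "aggressive"),
    ([1, 1, -1, -1] * 25, "playful"),
    ([1, 1, 0, -1, -1, 0] * 17, "playful"),
]
head = fit_head(train)
print(classify(head, [4, -4] * 50))  # → "aggressive"
```

In the real work, Wav2Vec2's learned speech representations play the role of `extract_features`, and the payoff is the same: labeled bark data is scarce, so reusing a model pretrained on abundant human speech lets a small head do the domain-specific classification.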