From ‘Dr. Dolittle’ to ‘Dr. AI-little’: How Artificial Intelligence is Making it Possible for Us to Talk to Our Furry (and Scaly, and Feathery) Friends

Have you ever wished you could understand what your pet was trying to tell you? Or wondered what the birds singing outside your window are saying to each other? Well, it turns out that scientists and researchers are working hard to make that a reality – and they’re using the power of artificial intelligence (AI) to do it.

You see, in order to promote sustainability and conservation efforts, it’s crucial that we understand what animals need and how they communicate. But animals don’t share a language with us. So, scientists are using AI to help decode the signals that animals use to talk to each other.

Think of it like a “Dr. Dolittle device” (but with a cooler name, like “Dr. AI-little”). Rather than just translating our words into something animals can understand, this technology would translate the languages of many different animals into something we can understand.

One organization leading the charge in this field is the Earth Species Project (ESP) in California. ESP is a nonprofit group that uses AI to interpret the signals that animals use to communicate with each other. Their goal is to create a closer connection between humans and animals by allowing us to communicate with them in a more meaningful way.

To do this, researchers and scientists are compiling information on animal communication – everything from the sounds of entire ecosystems to the specific calls of individual creatures. ESP then uses this information to create AI programs that can translate animal communication and make it understandable to humans.

It’s not just about talking to your pets, though. By understanding animal languages, we can also learn more about their emotions, behaviors, and needs. For example, did you know that animals laugh too? Studies have shown that dogs, rats, and even hyenas make laughter-like sounds when they’re happy. Imagine how much we could learn about animals if we could understand their jokes!

So, the next time you’re trying to communicate with your cat and they just stare at you blankly, don’t give up hope. Thanks to the advancements in AI, we might soon be able to understand what they’re really trying to say. Who knows, maybe they’ll even teach us a thing or two about sustainability.

But why is this important? Well, according to Karen Bakker, a professor at the University of British Columbia and the author of The Sounds of Life: How Digital Technology is Bringing Us Closer to the Worlds of Animals and Plants, “AI is a phenomenal pattern recognition machine. But that only takes us so far. In order to truly understand animal communication, we do have to link this back to behavior.” In other words, understanding what animals are saying can help us understand their needs and behaviors, and that can help us protect and conserve their habitats.

Scientists are using AI to analyze data sets of both bioacoustics (recordings of individual organisms) and ecoacoustics (recordings of entire ecosystems). ESP has even published a benchmark called BEANS (the BEnchmark of ANimal Sounds) that uses 10 datasets of animal sounds to establish a baseline for machine learning classification and detection performance.
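To get a feel for what “classification” means here, consider a toy sketch (this is not ESP’s actual pipeline, and the species names, pitches, and nearest-centroid method are illustrative assumptions): we synthesize two kinds of “calls” at different pitches, reduce each clip to its dominant frequency, and label new clips by whichever known call they most resemble.

```python
import numpy as np

SR = 8000  # sample rate in Hz; an arbitrary choice for this toy example


def make_call(freq_hz, duration_s=0.5):
    """Synthesize a toy 'animal call': a pure tone plus a little noise."""
    t = np.arange(int(SR * duration_s)) / SR
    rng = np.random.default_rng(0)
    return np.sin(2 * np.pi * freq_hz * t) + 0.1 * rng.standard_normal(t.size)


def dominant_frequency(signal):
    """Reduce a clip to one feature: the peak of its FFT magnitude spectrum."""
    mag = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / SR)
    return freqs[np.argmax(mag)]


# Hypothetical training clips: two 'species' with characteristic call pitches.
train = {"species_a": make_call(400.0), "species_b": make_call(1500.0)}
centroids = {name: dominant_frequency(clip) for name, clip in train.items()}


def classify(clip):
    """Nearest-centroid classification on the dominant-frequency feature."""
    f = dominant_frequency(clip)
    return min(centroids, key=lambda name: abs(centroids[name] - f))


print(classify(make_call(420.0)))   # a call near species_a's pitch
print(classify(make_call(1450.0)))  # a call near species_b's pitch
```

Real bioacoustic models use far richer features (spectrograms fed to neural networks) and thousands of labeled recordings, but the task shape is the same: map a sound clip to a species or call type.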

Researchers are studying communication from a variety of species, including birds, amphibians, primates, elephants, and even insects like honeybees. But cetaceans (whales, dolphins, and other marine mammals) are especially interesting: they have a long evolutionary history of vocal communication, and because light doesn’t propagate well underwater, more of their communication is forced through the acoustic channel.

By decoding animal communication, we can do things like:

  • Identify new populations, like the blue whale population that was discovered in the Indian Ocean in 2021
  • Evaluate the health of the natural environment, like reforested areas of the rainforest in Costa Rica
  • Establish marine animal protection zones, like the “mobile marine protected areas” that were created off the West Coast of the United States

It’s pretty cool stuff, and it’s important too, because as Kay Firth-Butterfield, the World Economic Forum’s head of AI and machine learning, said, “Understanding what animals say is the first step to giving other species on the planet ‘a voice’ in conversations on our environment.”

References:

  • Bakker, K. (2022). The Sounds of Life: How Digital Technology is Bringing Us Closer to the Worlds of Animals and Plants. Princeton University Press.
  • Earth Species Project. (2022). BEANS: The BEnchmark of ANimal Sounds.
  • Firth-Butterfield, K. (2021). Understanding what animals say is the first step to giving them a voice in conservation. World Economic Forum.
  • Nature. (2021). New blue whale population identified in the Indian Ocean.