I recommend watching the video “Decoding Nonhuman Communication | Polylogues” on YouTube to learn more about the fascinating topic of understanding non-human animal communication. The video features experts in the field who discuss the latest research and technology being used to understand how animals communicate with each other. It is a great way to learn about the different ways animals communicate, such as through vocalizations, body language, and even chemical signals.

Understanding non-human animals is also important because it can help us better appreciate and protect the natural world around us. By studying how animals communicate and interact with each other, we can better understand the roles they play in their ecosystems and how human actions affect them. Such understanding can also shed light on our own communication and behavior as humans.

In August 2020, the Simons Institute for the Theory of Computing hosted a workshop on Decoding Communication in Nonhuman Species. In this episode of Polylogues, Simons Institute Director Shafi Goldwasser sits down with Michael Bronstein (Imperial College London and Twitter), David Gruber (City University of New York), and Lilach Hadany (Tel Aviv University) to discuss how machine learning can help us understand the meanings hidden in the sounds and signals made by the species with whom we share our world. One of those species is the sperm whale, the research subject of Project CETI (Cetacean Translation Initiative), which was officially launched in April 2021.

Sperm whales possess the largest brains of any species on Earth, and have complex social lives with strong intergenerational ties, much like humans. They emit complex vocalizations in the form of patterned clicks, with groups of whales developing unique dialects. Project CETI brings together leading researchers in sperm whale field biology, robotics engineering, machine learning, and linguistics to collect vast amounts of communication data at their field station in Dominica, and to apply machine learning and natural language processing to decoding these communications. Shafi, Michael, and David are key participants in the project, which is funded by The Audacious Project, a collaborative funding initiative housed at TED. Lilach, by contrast, studies acoustic communication in plants. She and her collaborators have demonstrated that plants respond to the sounds emitted by pollinators, and are now applying machine-learning techniques to investigate whether plants respond to the sounds other plants emit under specific conditions.
