Understanding animal vocalisations has long been a subject of human fascination and study. After all, would it not be nice to know what your pet is trying to communicate? Or to understand a conversation between two whales? What could a Google Translate for the animal kingdom mean to us? We would surely gain an enormous amount of knowledge and perhaps a better understanding of the world around us. As it turns out, research toward developing such technology is already under way, and that is what I will discuss here.
At first glance, most animal communication might seem simplistic compared to human language, but animals’ ways of interacting simply differ from ours, which we often overlook and therefore fail to acknowledge. Many animals have a repertoire of discrete vocalisations that carry fixed meanings. For instance, when asked by a handler using hand gestures to perform a novel trick, captive dolphins disappear underwater, exchange sounds and then emerge, somehow having coordinated their actions. Moreover, scientists have found evidence that dolphins call each other by “name”: research has revealed that these marine mammals use a unique signature whistle to identify one another. Dolphins are just one of many species that show such signs of communication. “For example, a low-ranking rhesus monkey will make a ‘noisy scream’ sound when confronted by a higher-ranking member of the social group, while the higher-ranking member will make ‘arched screams’ — a separate, distinct sound,” writes Christian Monson (Towards Data Science, May 4, 2022).
Now the question arises: how can we use artificial intelligence to decipher the various sounds animals make? Four major organisations are exploring the subject: Briefer, DeepSqueak, Project CETI and ESP (the Earth Species Project). The first two focus on distinct species, studying the sounds associated with different emotions in pigs and rodents respectively. Project CETI also concentrates on a single species, the sperm whale, and on how machine learning could be used to translate its calls. ESP, however, is unique, one might even say unattainably ambitious: it aims to develop a program that could be applied to the entire animal kingdom.
“The ‘motivating intuition’ for ESP is work that has shown that machine learning can be used to translate between different, sometimes distant human languages – without the need for any prior knowledge,” says Aza Raskin, the co-founder and president of ESP. The algorithm that does so maps words as points in a multi-dimensional geometric representation, which captures how the words relate to one another. The translation is then found by looking at the “shapes” formed by those points and noticing that the shapes are similar across different languages, so one space can be aligned with the other.
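The idea of matching “shapes” across embedding spaces can be made concrete with a small sketch. Below, language B’s toy embedding space is a hidden rotation of language A’s, and the rotation is recovered with the classic orthogonal Procrustes solution (an SVD). This is purely illustrative, with made-up word vectors; it is not ESP’s actual pipeline, where embeddings would come from models trained on real data.

```python
import numpy as np

# Toy "embedding spaces": five concepts in language A, and the same
# concepts in language B, whose space is a rotated copy of A's.
# (Invented 2-D vectors for illustration only.)
A = np.array([
    [1.0, 0.0],   # "water"
    [0.0, 1.0],   # "food"
    [1.0, 1.0],   # "danger"
    [-1.0, 0.5],  # "friend"
    [0.5, -1.0],  # "move"
])
theta = 0.7  # the hidden rotation between the two spaces
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
B = A @ R_true.T

# Orthogonal Procrustes: the rotation W minimising ||A @ W.T - B||
# is read off from the SVD of B^T A.
U, _, Vt = np.linalg.svd(B.T @ A)
W = U @ Vt

# The recovered map matches the hidden rotation, so A's "shape"
# has been aligned onto B's.
print(np.allclose(W, R_true))  # True
```

In this toy setting the word pairings are known; the harder part of the real problem is that the alignment must be found without such a dictionary, using only the geometry itself.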
However, the project faces a few significant challenges. Firstly, it will need to translate across different modes of communication, since animals do not communicate only vocally. Bees, for example, let others know of a flower’s location by performing a “waggle dance”, which tells the watching bees two things about a flower patch: its distance and its direction from the hive. Secondly, there is the “cocktail party problem”, which arises when “sounds from different sources in the world mix in the air before arriving at the ear” (PNAS, 2018), making it difficult to discern which individual in a noisy social group is producing a given sound. Finally, Professor Robert Seyfarth emphasises that since many species do not have vocal cords as well developed as ours, the same sound can mean different things depending on the context and on whom the animal has interacted with. According to him, AI might be unable to capture such contextual data and could therefore translate incorrectly.
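The waggle dance is actually one of the few animal signals whose code is already well understood: the angle of the waggle run relative to vertical on the comb corresponds to the flower’s direction relative to the sun, and the duration of the run scales with distance. A minimal decoder might look like the sketch below; the calibration constant is a rough assumption (figures of about one second of waggling per kilometre are often cited for honeybees, but it varies by species and study).

```python
def decode_waggle(dance_angle_deg, waggle_seconds,
                  sun_azimuth_deg, metres_per_second=1000.0):
    """Turn a waggle dance into an (azimuth, distance) estimate.

    dance_angle_deg: angle of the waggle run relative to vertical on the comb
    waggle_seconds: duration of the waggle phase
    sun_azimuth_deg: compass bearing of the sun
    metres_per_second: assumed distance calibration (rough, species-dependent)
    """
    # Direction: the dance angle is added to the sun's bearing.
    flower_azimuth = (sun_azimuth_deg + dance_angle_deg) % 360
    # Distance: waggle duration scales roughly linearly with distance.
    distance_m = waggle_seconds * metres_per_second
    return flower_azimuth, distance_m

# A dance 40 degrees right of vertical, lasting 1.5 s, with the sun at 180 degrees:
azimuth, distance = decode_waggle(40.0, 1.5, 180.0)
print(azimuth, distance)  # 220.0 1500.0
```

The point of the sketch is that this channel is visual and spatial rather than acoustic, which is exactly why a single translation system would have to handle more than sound.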
What makes this project so sensational is the prospect of communicating with animals, or at least understanding them, which is something many people, especially scientists, dream of. And even if we fail, the knowledge we gain along the way will help improve both research and conservation. Whether we achieve it or not, it is astounding to think how far we have come, and a little frightening to wonder where it will take us next.
What do you think? Can we understand animals at the current stage of our AI development? And, perhaps a more important question, should we?
References:
Morelle, R. (2013, July 23). Dolphins ‘call each other by name’. BBC News. Retrieved October 9, 2022, from https://www.bbc.com/news/science-environment-23410137
Ben-Yami, H. (2017, March 1). Can Animals Acquire Language? Scientific American. Retrieved October 9, 2022, from https://blogs.scientificamerican.com/guest-blog/can-animals-acquire-language/
Monson, C. (2022, May 4). A.I. Talks with Animals – Can machine learning algorithms eavesdrop on animal language? Towards Data Science. Retrieved October 9, 2022, from https://towardsdatascience.com/a-i-talks-with-animals-3f0a266acc79
Corbyn, Z. (2022, July 31). Can artificial intelligence really help us talk to the animals? The Guardian. Retrieved October 9, 2022, from https://www.theguardian.com/science/2022/jul/31/can-artificial-intelligence-really-help-us-talk-to-the-animals
Wow, I’m impressed with how far we’ve already come, and I think we still have a lot of work ahead of us. We don’t only need artificial intelligence to decode animalese; we will also need to understand what it is to be an animal.
As the philosopher Ludwig Wittgenstein said, “if a lion could talk, we could not understand him” – our human minds would not share the sensory and conceptual landscape that lion-talk would express.
But I believe that the possibility of conversing with them would increase people’s awareness that they are creatures who, like us, think and communicate, and that they should be protected to a greater extent. I look forward to what the future holds :)))