Recent advancements in artificial intelligence (AI) and bioacoustics have opened a unique opportunity to explore and decode animal communication. With the growing availability of bioacoustic data and sophisticated machine learning models, researchers are now in a position to make significant strides in understanding non-human animal languages. However, realizing this potential requires a deliberate integration of AI and ethology. The AI for Non-Human Animal Communication workshop at NeurIPS 2025 will focus on the challenges of processing complex bioacoustic data and interpreting animal signals. The workshop will feature keynote talks, a poster session, and a panel discussion, all aimed at advancing the use of AI to uncover the mysteries of animal communication and its implications for biodiversity and ecological conservation. The workshop is inviting submissions for short papers and proposals related to the use of AI in animal communication. Topics of interest include bioacoustics, multimodal learning, ecological monitoring, species-specific studies, and the ethical considerations of applying AI in animal research. Papers should present novel research, methodologies, or technologies in these areas, and will undergo a double-blind review process. The paper submission deadline is September 5, 2025, with notifications of acceptance by September 22, 2025. More information is available at aiforanimalcomms.org.
The DEEP VOICE Project
The DEEP VOICE project will be launched at the FHNW School of Business in early September 2025. It was initiated by Prof. Dr. Oliver Bendel. “DEEP VOICE” stands for “Decoding Environmental and Ethological Patterns in Vocal Communication of Cetaceans”. The project aims to decode symbolic forms of communication in animals, especially whales. It is based on the conviction that animal communication should not be interpreted from a human perspective, but understood in the context of the species-specific environment. The focus is therefore on developing an AI model that is trained on a comprehensive environmental and behavioral model of the respective animal. By integrating bioacoustic data, ecological parameters, and social dynamics, the aim is to create an animal-centered translation approach that allows meaning carriers in animal vocalizations to be identified without distorting them anthropocentrically. The project combines modern AI methods with ethological and ecological foundations and thus aims to contribute to a better understanding of non-human intelligence and communication culture as well as to animal-computer interaction. Oliver Bendel and his students have so far focused primarily on the body language of domestic and farm animals (The Animal Whisperer Project) and the behavior of domestic (The Robodog Project) and wild animals (VISUAL).
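How such an animal-centered model might combine signal and context has not been published; the following is a minimal sketch, assuming a PyTorch setup in which mel spectrograms of whale vocalizations are fused with a small vector of ecological and social features (depth, group size, time of day, and so on). The class name and feature choices are hypothetical, not project code.

```python
# Hypothetical sketch: fusing bioacoustic and ecological inputs into one embedding.
# All module and feature names are assumptions for illustration, not DEEP VOICE code.
import torch
import torch.nn as nn

class ContextAwareVocalizationEncoder(nn.Module):
    def __init__(self, n_mels: int = 64, context_dim: int = 8, embed_dim: int = 128):
        super().__init__()
        # Small CNN over mel spectrograms of whale vocalizations
        self.audio_net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Ecological/behavioral context: e.g. depth, group size, time of day, season
        self.context_net = nn.Sequential(nn.Linear(context_dim, 32), nn.ReLU())
        self.head = nn.Linear(32 + 32, embed_dim)

    def forward(self, spectrogram: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        a = self.audio_net(spectrogram)               # (batch, 32) acoustic features
        c = self.context_net(context)                 # (batch, 32) context features
        return self.head(torch.cat([a, c], dim=-1))   # joint, context-aware embedding

# Example with random data: one 64x256 spectrogram plus an 8-dimensional context vector
model = ContextAwareVocalizationEncoder()
emb = model(torch.randn(1, 1, 64, 256), torch.randn(1, 8))
print(emb.shape)  # torch.Size([1, 128])
```

The point of the sketch is the design choice the project describes: the vocal signal is never interpreted in isolation, but always together with its environmental and behavioral context.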
When Animals and Robots Meet
The volume “Animals, Ethics, and Engineering: Intersections and Implications”, edited by Rosalyn W. Berne, was published on 7 August 2025. The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. The title of Oliver Bendel’s contribution is “An Investigation into the Encounter Between Social Robots and Animals”. The abstract reads: “Increasingly, social robots and certain service robots encounter, whether this is planned or not, domestic, farm, or wild animals. They react differently, some interested, some disinterested, some lethargic, some panicked. Research needs to turn more to animal-robot relationships, and to work with engineers to design these relationships in ways that promote animal welfare and reduce animal suffering. This chapter is about social robots that are designed for animals, but also those that – for different, rather unpredictable reasons – meet, interact, and communicate with animals. It also considers animal-friendly machines that have emerged in the context of machine ethics. In the discussion section, the author explores the question of which of the presented robots are to be understood as social robots and what their differences are in their purpose and in their relationship to animals. In addition, social and ethical aspects are addressed.” The book was produced by Jenny Stanford Publishing and can be ordered via online stores.
Completion of the VISUAL Project
On July 31, 2025, the final presentation of the VISUAL project took place. The initiative was launched by Prof. Dr. Oliver Bendel from the University of Applied Sciences and Arts Northwestern Switzerland (FHNW). It was carried out by Doris Jovic, who is completing her Bachelor’s degree in Business Information Technology (BIT) in Basel. “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, there are webcams showing wild animals. Sighted individuals can use them to go on photo or video safaris from the comfort of their couches. However, blind and visually impaired people are at a disadvantage. As part of Inclusive AI, a prototype was developed specifically for them in this project. Public webcams around the world that are focused on wildlife are accessed. Users can choose between various habitats on land or in water. Additionally, they can select a profile – either “Adult” or “Child” – and a role such as “Safari Adventurer,” “Field Scientist”, or “Calm Observer”. When a live video is launched, three screenshots are taken and compiled into a bundle. This bundle is then analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The user receives a spoken description of the scene and the activities. The needs of blind and visually impaired users were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is likely one of the first to combine Inclusive AI with new approaches from the field of Animal-Computer Interaction (ACI).
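The project description does not include code; the following is a minimal sketch of such a screenshot-and-describe pipeline, assuming OpenCV for frame capture and the OpenAI Python SDK for the GPT-4o call. The stream URL, prompt, and function names are illustrative, not the prototype's actual implementation.

```python
# Minimal sketch of a VISUAL-style pipeline: grab three screenshots from a public
# wildlife webcam, send them to GPT-4o, and return a scene description.
# The stream URL and prompt are hypothetical placeholders.
import base64
import time

import cv2  # pip install opencv-python
from openai import OpenAI  # pip install openai

STREAM_URL = "https://example.org/wildlife-webcam/stream.m3u8"  # hypothetical webcam
PROMPT = ("You are a calm observer describing a wildlife webcam for a blind user. "
          "Describe the scene and the animals' activities in a few spoken-style sentences.")

def grab_screenshots(url: str, count: int = 3, delay_s: float = 2.0) -> list[str]:
    """Capture `count` frames from the stream and return them as base64-encoded JPEGs."""
    cap = cv2.VideoCapture(url)
    shots = []
    while len(shots) < count:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        if ok:
            shots.append(base64.b64encode(jpeg.tobytes()).decode("ascii"))
        time.sleep(delay_s)
    cap.release()
    return shots

def describe(shots: list[str]) -> str:
    client = OpenAI()  # expects OPENAI_API_KEY in the environment
    content = [{"type": "text", "text": PROMPT}]
    for shot in shots:
        content.append({"type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{shot}"}})
    response = client.chat.completions.create(
        model="gpt-4o", messages=[{"role": "user", "content": content}])
    return response.choices[0].message.content

if __name__ == "__main__":
    description = describe(grab_screenshots(STREAM_URL))
    print(description)  # in the real prototype this text would be read aloud via TTS
```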
Robot Rabbits vs. Pythons
Florida is testing a new tool to fight invasive Burmese pythons in the Everglades: robot rabbits. As reported by the Palm Beach Post (July 15, 2025), researchers at the University of Florida, led by wildlife ecologist Robert McCleery, have developed motorized toy bunnies that mimic the movement and body heat of real rabbits. Pythons are known to be drawn to live prey, but using real animals proved impractical – so science stepped in. The solar-powered robot rabbits are placed in test areas and monitored by motion-triggered cameras. When something approaches, researchers get an alert. If it’s a python, trained response teams or nearby hunters can react quickly. If needed, scent may be added to increase effectiveness. The project is funded by the South Florida Water Management District. It complements a wide range of state efforts to control python populations, from infrared detection and DNA tracking to the annual Python Challenge. While full eradication is unlikely, these innovative methods offer hope for better control of one of Florida’s biggest ecological threats. A new book contribution by Oliver Bendel entitled “An Investigation into the Encounter Between Social Robots and Animals” deals with animal-like robots that interact with animals in the wild. The book “Animals, Ethics, and Engineering”, which includes this contribution, will be published by Jenny Stanford Publishing in August 2025.
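The article does not describe the camera software, so the following is only a generic sketch of how a motion-triggered alert of this kind could work, assuming OpenCV background subtraction on a camera or video stream; the alert itself is left as a placeholder.

```python
# Generic camera-trap sketch (not the Florida project's software): raise an alert
# when something large moves in front of the camera watching a robot rabbit.
import cv2

def watch(stream=0, min_area=5000):
    cap = cv2.VideoCapture(stream)  # camera index or video/stream URL
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)        # foreground (moving pixels) mask
        mask = cv2.medianBlur(mask, 5)        # suppress sensor noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if any(cv2.contourArea(c) > min_area for c in contours):
            print("Motion detected: notify the response team")  # placeholder alert hook
    cap.release()
```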
Why Animals Can (Still) Outrun Robots
In an article published in Science Robotics in April 2024, Samuel A. Burden and his co-authors explore the question of why animals can outrun robots. In their abstract they write: “Animals are much better at running than robots. The difference in performance arises in the important dimensions of agility, range, and robustness. To understand the underlying causes for this performance gap, we compare natural and artificial technologies in the five subsystems critical for running: power, frame, actuation, sensing, and control. With few exceptions, engineering technologies meet or exceed the performance of their biological counterparts. We conclude that biology’s advantage over engineering arises from better integration of subsystems, and we identify four fundamental obstacles that roboticists must overcome. Toward this goal, we highlight promising research directions that have outsized potential to help future running robots achieve animal-level performance.” (Abstract) The article was published at a time when the market for robotic four-legged friends was exploding. Spot, Unitree Go2, and many others can certainly compete with some animals when it comes to running. In suppleness and elegance, however, further progress is still needed.
ANIFACE: Animal Face Recognition
Facial recognition is a problematic technology, especially when it is used to monitor people. However, it also has potential, for example with regard to the recognition of individual animals. Prof. Dr. Oliver Bendel had announced the topic “ANIFACE: Animal Face Recognition” at the University of Applied Sciences FHNW in 2021 and left it open whether it should focus on wolves or bears. Ali Yürekkirmaz accepted the assignment and, in his final thesis, designed a system that could be used to identify individual bears in the Alps – without electronic collars or implanted microchips – and initiate appropriate measures. The idea is that suitable camera and communication systems are available in certain areas. Once a bear is identified, the system determines whether it is considered harmless or dangerous, and then informs the relevant agencies or the people concerned directly. Walkers can be made aware of the recordings – but it is also technically possible to protect their privacy. In an expert discussion with a representative of KORA, the student gained important insights into wildlife monitoring in general and bear monitoring in particular, and a survey allowed him to gauge the attitudes of parts of the population. Building on the work of Ali Yürekkirmaz, delivered in January 2022, an algorithm for bears could be developed and an ANIFACE system implemented and evaluated in the Alps. A video about the project is available here.
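The thesis is a concept rather than an implementation; the following is a hypothetical sketch of the matching step such an ANIFACE system might use, assuming a trained bear-face embedding network (stubbed out here) and a small gallery of known individuals. The names, threshold, and cosine-similarity matching are illustrative choices, not Ali Yürekkirmaz's design.

```python
# Hypothetical ANIFACE-style matching step: compare a new bear-face embedding with a
# gallery of known individuals and decide whether to warn agencies and walkers.
import numpy as np

GALLERY = {                       # known individuals: name -> (embedding, dangerous?)
    "M13": (np.random.rand(128), True),
    "F07": (np.random.rand(128), False),
}

def embed(image: np.ndarray) -> np.ndarray:
    """Placeholder for a trained bear-face embedding network (assumed, not real)."""
    return np.random.rand(128)

def identify(image: np.ndarray, threshold: float = 0.8):
    query = embed(image)
    best_name, best_score = None, -1.0
    for name, (ref, _) in GALLERY.items():
        score = float(np.dot(query, ref) / (np.linalg.norm(query) * np.linalg.norm(ref)))
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        return None                       # no confident match
    return best_name, GALLERY[best_name][1]

match = identify(np.zeros((256, 256, 3)))   # a captured camera frame would go here
if match is None:
    print("Unknown bear: flag the sighting for manual review")
elif match[1]:
    print(f"Bear {match[0]} classified as dangerous: warn agencies and people in the area")
else:
    print(f"Bear {match[0]} classified as harmless: log the sighting")
```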
Talking with Animals
We use our natural language, facial expressions, and gestures when communicating with our fellow humans. Some of our social robots also have these abilities, and so we can converse with them in the usual way. Many highly evolved animals have a language in which there are sounds and signals that have specific meanings. Some of them – like chimpanzees or gorillas – have abilities in facial expression and gesture comparable to ours. Britt Selvitelle and Aza Raskin, founders of the Earth Species Project, want to use machine learning to enable communication between humans and animals. Languages, they believe, can not only be represented as geometric structures but also be translated by matching those structures to each other. They say they have started working on whale and dolphin communication. Over time, the focus will broaden to include primates, corvids, and others. The two scientists would also need to study not only natural language but facial expressions, gestures, and other movements associated with meaning (they are well aware of this challenge). In addition, there are aspects of animal communication that are inaudible or invisible to humans and that would need to be considered. Britt Selvitelle and Aza Raskin believe that translation would open up the world of animals – but it could also be the other way around: they may first have to open up the world of animals in order to decode their language. However, should there be breakthroughs in this area, it would be an opportunity for animal welfare. For example, social robots, autonomous cars, wind turbines, and other machines could use animal languages alongside mechanical signals and human commands to instruct, warn, and scare away dogs, elk, pigs, and birds. Machine ethics has been developing animal-friendly machines for years. Among other things, the scientists use sensors together with decision trees. Depending on the situation, braking and evasive maneuvers are initiated. Maybe one day the autonomous car will be able to avoid an accident by calling out in deer dialect: Hello deer, go back to the forest!
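The idea of matching geometric structures can be illustrated with a technique known from machine translation: aligning two embedding spaces with an orthogonal Procrustes rotation. The toy sketch below uses random stand-ins for "human" and "whale" embeddings and a handful of assumed anchor pairs; the Earth Species Project's actual methods are more ambitious and aim at unsupervised alignment without such anchors.

```python
# Toy illustration of structure matching between two embedding spaces.
# The "whale" space is a rotated, noisy copy of the "human" space, so an
# orthogonal Procrustes fit can recover the mapping between them.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)
human = rng.normal(size=(500, 64))                    # embeddings of human signal units
rotation = np.linalg.qr(rng.normal(size=(64, 64)))[0]  # hidden shared structure
whale = human @ rotation + 0.01 * rng.normal(size=(500, 64))

# With a few anchor pairs (signals assumed to share meaning), estimate the mapping ...
R, _ = orthogonal_procrustes(human[:50], whale[:50])

# ... then map the remaining human embeddings into the "whale" space and check the fit.
error = np.linalg.norm(human[50:] @ R - whale[50:]) / np.linalg.norm(whale[50:])
print(f"relative alignment error: {error:.3f}")  # small if the structures really match
```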
Dogs Obey Social Robots
The field of animal-machine interaction is gaining new research topics with social robots. Meiying Qin from Yale University and her co-authors have brought together a Nao and a dog. From the abstract of their paper: “In two experiments, we investigate whether dogs respond to a social robot after the robot called their names, and whether dogs follow the ‘sit’ commands given by the robot. We conducted a between-subjects study (n = 34) to compare dogs’ reactions to a social robot with a loudspeaker. Results indicate that dogs gazed at the robot more often after the robot called their names than after the loudspeaker called their names. Dogs followed the ‘sit’ commands more often given by the robot than given by the loudspeaker. The contribution of this study is that it is the first study to provide preliminary evidence that 1) dogs showed positive behaviors to social robots and that 2) social robots could influence dog’s behaviors. This study enhance the understanding of the nature of the social interactions between humans and social robots from the evolutionary approach. Possible explanations for the observed behavior might point toward dogs perceiving robots as agents, the embodiment of the robot creating pressure for socialized responses, or the multimodal (i.e., verbal and visual) cues provided by the robot being more attractive than our control condition.” (Abstract) You can read the full paper via dl.acm.org/doi/abs/10.1145/3371382.3380734.
Imitating the Agile Locomotion Skills of Four-legged Animals
Imitating the agile locomotion skills of animals has been a longstanding challenge in robotics. Manually designed controllers have been able to reproduce many complex behaviors, but building such controllers is time-consuming and difficult. According to Xue Bin Peng (Google Research and University of California, Berkeley) and his co-authors, reinforcement learning provides an interesting alternative for automating the manual effort involved in the development of controllers. In their work, they present “an imitation learning system that enables legged robots to learn agile locomotion skills by imitating real-world animals” (Xue Bin Peng et al. 2020). They show “that by leveraging reference motion data, a single learning-based approach is able to automatically synthesize controllers for a diverse repertoire [of] behaviors for legged robots” (Xue Bin Peng et al. 2020). By incorporating sample-efficient domain adaptation techniques into the training process, their system “is able to learn adaptive policies in simulation that can then be quickly adapted for real-world deployment” (Xue Bin Peng et al. 2020). For demonstration purposes, the scientists trained “a quadruped robot to perform a variety of agile behaviors ranging from different locomotion gaits to dynamic hops and turns” (Xue Bin Peng et al. 2020).
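At the heart of the approach is a reward that scores how closely the robot tracks the retargeted animal motion. The sketch below illustrates such a motion-imitation reward in the style of the paper; the weights and scales are illustrative, not the authors' exact values.

```python
# Minimal sketch of a motion-imitation reward: the policy is rewarded for tracking a
# reference motion recorded from a real animal. Weights/scales are illustrative only.
import numpy as np

def imitation_reward(joint_pos, joint_vel, ref_pos, ref_vel,
                     w_pose=0.6, w_vel=0.4, k_pose=5.0, k_vel=0.1):
    """Reward in [0, 1]: high when the robot's joints track the reference motion."""
    pose_err = np.sum((np.asarray(ref_pos) - np.asarray(joint_pos)) ** 2)
    vel_err = np.sum((np.asarray(ref_vel) - np.asarray(joint_vel)) ** 2)
    return w_pose * np.exp(-k_pose * pose_err) + w_vel * np.exp(-k_vel * vel_err)

# At each control step, a reinforcement learning algorithm (e.g. PPO) maximizes this
# reward, so the policy gradually learns to reproduce the retargeted animal gait.
r = imitation_reward(joint_pos=[0.1, -0.2], joint_vel=[0.0, 0.1],
                     ref_pos=[0.12, -0.18], ref_vel=[0.05, 0.1])
print(round(r, 3))
```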