Recent advances in artificial intelligence (AI) and bioacoustics have opened a unique opportunity to explore and decode animal communication. With the growing availability of bioacoustic data and sophisticated machine learning models, researchers are now in a position to make significant strides in understanding non-human animal languages. However, realizing this potential requires a deliberate integration of AI and ethology. The AI for Non-Human Animal Communication workshop at NeurIPS 2025 will focus on the challenges of processing complex bioacoustic data and interpreting animal signals. The workshop will feature keynote talks, a poster session, and a panel discussion, all aimed at advancing the use of AI to uncover the mysteries of animal communication and its implications for biodiversity and ecological conservation. The workshop invites submissions of short papers and proposals on the use of AI in animal communication. Topics of interest include bioacoustics, multimodal learning, ecological monitoring, species-specific studies, and the ethical considerations of applying AI in animal research. Papers should present novel research, methodologies, or technologies in these areas and will undergo a double-blind review process. The paper submission deadline is September 5, 2025, with notifications of acceptance by September 22, 2025. More information is available at aiforanimalcomms.org.
The DEEP VOICE Project
The DEEP VOICE project will be launched at the FHNW School of Business in early September 2025. It was initiated by Prof. Dr. Oliver Bendel. “DEEP VOICE” stands for “Decoding Environmental and Ethological Patterns in Vocal Communication of Cetaceans”. The project aims to decode symbolic forms of communication in animals, especially whales. It is based on the conviction that animal communication should not be interpreted from a human perspective but understood in the context of the species-specific environment. The focus is therefore on developing an AI model that is trained on a comprehensive environmental and behavioral model of the respective animal. By integrating bioacoustic data, ecological parameters, and social dynamics, the aim is to create an animal-centered translation approach that allows meaning carriers in animal vocalizations to be identified without anthropocentric distortion. The project combines modern AI methods with ethological and ecological foundations and thus aims to contribute to a better understanding of non-human intelligence and communication culture, as well as to animal-computer interaction. Oliver Bendel and his students have so far focused primarily on the body language of domestic and farm animals (The Animal Whisperer Project) and the behavior of domestic animals (The Robodog Project) and wild animals (VISUAL).
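One way to picture this animal-centered approach is to pair each recorded vocalization with the ecological and social context in which it occurred, so that downstream models group calls by situation rather than by human analogy. The following minimal Python sketch illustrates the idea; the `VocalizationEvent` structure, the chosen context features, and the simple concatenation into a joint feature vector are illustrative assumptions, not the project's actual design.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VocalizationEvent:
    """One recorded call plus the context in which it occurred (hypothetical schema)."""
    acoustic_embedding: np.ndarray  # e.g., from a pretrained bioacoustic encoder
    depth_m: float                  # diving depth when the call was recorded
    group_size: int                 # number of animals in the vocalizing group
    behavior_state: str             # e.g., "foraging", "traveling", "socializing"

BEHAVIOR_STATES = ["foraging", "traveling", "socializing"]

def joint_features(event: VocalizationEvent) -> np.ndarray:
    """Concatenate acoustic and context features into one vector for clustering."""
    behavior_onehot = np.array(
        [float(event.behavior_state == s) for s in BEHAVIOR_STATES]
    )
    context = np.array([event.depth_m, float(event.group_size)])
    return np.concatenate([event.acoustic_embedding, context, behavior_onehot])

# Two calls that sound alike but occur in different contexts end up in
# different regions of the joint feature space.
event = VocalizationEvent(np.random.rand(128), depth_m=40.0, group_size=5,
                          behavior_state="socializing")
print(joint_features(event).shape)  # (133,)
```

In such a setup, clusters of joint feature vectors, rather than acoustic similarity alone, would be the candidates for meaning carriers.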
The Robodog Project is Done
“The Robodog Project: Bao Meets Pluto” examined how domestic dogs respond to the Unitree Go2 quadruped robot – nicknamed Bao by project initiator Prof. Dr. Oliver Bendel – and how their owners perceive such robots in shared public spaces. The project began in late March 2025 and was completed in early August 2025. The study addressed three questions: (1) How do dogs behaviorally respond to a quadruped robot across six conditions: stationary, walking, and jumping without an additional dog head, and stationary, walking, and jumping with an additional 3D-printed dog head? (2) What are owners’ expectations and concerns? (3) What regulatory frameworks could support safe integration? Twelve dogs were observed in six structured interaction phases; their behavior was video-coded using BORIS. Another dog participated in a preliminary test but not in the actual study. Pre-exposure interviews with eight owners, as well as an expert interview with a biologist and dog trainer, provided additional insights. Led by Selina Rohr, the study found most dogs were cautious but not aggressive. Curiosity increased during robot movement, while visual modifications had little overall impact; the 3D-printed dog head did, however, seem to attract the dogs’ interest when the robot was stationary. Dogs often sought guidance from their owners, underlining the role of human mediation. Owners were cautiously open but emphasized concerns around safety, unpredictability, and liability. The findings support drone-like regulation for robot use in public spaces.
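Video coding in BORIS produces event logs that can be aggregated per condition. The sketch below shows one plausible way to summarize such data in Python; the column names and the simplified CSV layout are assumptions for illustration and do not reflect the project's actual export or analysis.

```python
import pandas as pd

# Assumed simplified export of BORIS event data (hypothetical columns):
# dog_id, condition, behavior, duration_s
# where condition is one of the six phases, e.g. "walking_with_head".
events = pd.read_csv("boris_events.csv")

# Total duration of each behavior per dog, then averaged within each condition.
per_dog = (events
           .groupby(["condition", "behavior", "dog_id"])["duration_s"]
           .sum()
           .reset_index())
summary = (per_dog
           .groupby(["condition", "behavior"])["duration_s"]
           .agg(["mean", "std", "count"])
           .round(1))
print(summary)  # e.g., whether approach behavior rises in the walking conditions
```

A table like this makes it straightforward to compare, say, curiosity-related behaviors between the stationary and walking conditions, with and without the dog head.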
Completion of the VISUAL Project
On July 31, 2025, the final presentation of the VISUAL project took place. The initiative was launched by Prof. Dr. Oliver Bendel from the University of Applied Sciences and Arts Northwestern Switzerland (FHNW). It was carried out by Doris Jovic, who is completing her Bachelor’s degree in Business Information Technology (BIT) in Basel. “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, there are webcams showing wild animals. Sighted individuals can use them to go on photo or video safaris from the comfort of their couches. Blind and visually impaired people, however, are at a disadvantage. As part of Inclusive AI, a prototype was developed specifically for them in this project. It accesses public webcams around the world that are focused on wildlife. Users can choose between various habitats on land or in water. Additionally, they can select a profile – either “Adult” or “Child” – and a role such as “Safari Adventurer”, “Field Scientist”, or “Calm Observer”. When a live video is launched, three screenshots are taken and compiled into a bundle. This bundle is then analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The user receives a spoken description of the scene and the activities. The needs of blind and visually impaired users were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is likely one of the first to combine Inclusive AI with new approaches from the field of Animal-Computer Interaction (ACI).
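The capture-and-describe pipeline can be sketched in a few lines of Python. In this illustration, the screenshot filenames, the prompt text, and the helper function are assumptions; only the general pattern (sending several images to GPT-4o and reading back a scene description) follows the project description, and the spoken output would require an additional text-to-speech step.

```python
import base64
from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

client = OpenAI()

def encode_image(path: str) -> str:
    """Read a screenshot and return it as a base64 data URL."""
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    return f"data:image/jpeg;base64,{b64}"

# Hypothetical bundle of three screenshots taken from a live wildlife webcam.
bundle = [encode_image(p) for p in ("shot1.jpg", "shot2.jpg", "shot3.jpg")]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe the animals and their activities in these "
                     "webcam images for a blind listener. Role: Calm Observer."},
            *[{"type": "image_url", "image_url": {"url": url}} for url in bundle],
        ],
    }],
)
print(response.choices[0].message.content)  # pass to a TTS engine for speech
```

Bundling three frames rather than one gives the model a rough sense of motion, which is what lets it describe activities and not just a static scene.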
When the Robodog is Barked at
Animal-machine interaction (AMI) is a discipline or field of work that deals with the interaction between animals and machines. This is how Prof. Dr. Oliver Bendel explains it in the Gabler Wirtschaftslexikon. It is primarily concerned with the design, evaluation, and implementation of complex machines and computer systems with which animals interact and which in turn interact and communicate with animals. There are close links to animal-computer interaction (ACI). Increasingly, the machine is a robot that is either remote-controlled or (partially) autonomous. In “The Robodog Project” (also known as “Bao Meets Pluto”), the encounters between robotic quadrupeds and small to medium-sized dogs are explored. The project collaborator is Selina Rohr, who is writing her bachelor’s thesis in this context. The walking, running, and jumping Unitree Go2 from Oliver Bendel’s private Social Robots Lab is used either in its original state or with a head made with a 3D printer, provided by Norman Eskera. The project is being carried out at the FHNW School of Business and will end on August 12, 2025, after which the results will be presented to the community and, if possible, to the general public.
The Robodog Project Starts
Robotic four-legged friends – often referred to as robot dogs – are becoming more and more widespread. As a result, they will also encounter more and more real dogs. The question is how to design, control, and program the robot in such a way that the animals do not overreact and that no harm comes to robots, animals, or bystanders. As part of “The Robodog Project”, smaller dogs are to be confronted with a walking, running, and jumping Unitree Go2. The plan is to visit controllable environments such as dog training grounds and arrange meetings with dog owners. The findings will lead to suggestions for design and control. Robot enhancement can also play a role here. For example, hobbyists have produced heads for the Unitree Go2 using a 3D printer, giving the robot a completely different look. Suggestions for programming will also be made. The project is due to start at the FHNW School of Business in March 2025. It is part of Prof. Dr. Oliver Bendel’s research in the field of animal-machine interaction.
13 Animal-Related Concepts and Artifacts
Since 2012, Oliver Bendel has developed 13 concepts and artifacts in the field of animal-computer interaction (ACI) or animal-machine interaction (AMI) together with his students. They can be divided into three categories. The first are animal- and nature-friendly concepts. The second are animal-friendly machines and systems (i.e., forms of moral machines). The third are animal-inspired machines and systems that replace the animals or bring them closer to us. Articles and book chapters have been published on many of the projects; the names of the developers can be found in these publications. A few prototypes made it into the media, such as LADYBIRD and HAPPY HEDGEHOG. Oliver Bendel repeatedly refers to Clara Mancini, the pioneer in the field of animal-computer interaction. Recently, ethicists such as Peter Singer have also turned their attention to the topic.
Award for ACI Paper
“The Animal Whisperer Project” by Oliver Bendel (FHNW School of Business) and Nick Zbinden (FHNW School of Business) won the Honourable Mention Short Paper Award at the 2024 ACI Conference. From the abstract: “Generative AI has become widespread since 2022. Technical advancements have resulted in multimodal large language models and other AI models that generate, analyze, and evaluate texts, images, and sounds. Such capabilities can be helpful in encounters between humans and animals. For example, apps with generative AI on a smartphone can be used to assess the body language and behavior of animals – e.g., during a walk or hike – and provide a recommendation for human behavior. It is often useful to take into account the animal’s environment and situation. The apps can help people to avert approaches and attacks, and thus also protect animals. In ‘The Animal Whisperer Project’, three apps were developed as prototypes based on the multimodal large language model GPT-4 from OpenAI from the beginning to mid-2024. Three specific GPTs resulted: the Cow Whisperer, the Horse Whisperer, and the Dog Whisperer. All three showed impressive capabilities after the first prompt engineering. These were improved by implementing information from expert interviews and adding labeled images of animals and other materials. AI-based apps for interpreting body language, behavior, and the overall situation can apparently be created today, without much effort, in a low-budget project. However, turning them into products would certainly raise questions, such as liability in the event of accidents.” The proceedings are available here.
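For readers curious what such a “whisperer” app looks like under the hood, the sketch below shows one plausible structure: a species-specific system prompt, of the kind that would be refined with expert interviews and labeled images, applied to a user-supplied photo. The prompt wording, the function name, and the use of the chat completions API with gpt-4o are illustrative assumptions; the project's actual GPTs were built on GPT-4, and their real instructions are not reproduced here.

```python
import base64
from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

client = OpenAI()

# Hypothetical system prompt in the spirit of the Dog Whisperer: rules like
# these would be iteratively refined with expert knowledge and example images.
DOG_WHISPERER_PROMPT = (
    "You assess dog body language from photos. Consider posture, tail, ears, "
    "and the surrounding situation. State whether the dog appears relaxed, "
    "alert, fearful, or threatening, and recommend how the human should behave."
)

def assess_dog(image_path: str) -> str:
    """Send one photo plus the species-specific instructions to the model."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": DOG_WHISPERER_PROMPT},
            {"role": "user", "content": [
                {"type": "text", "text": "I met this dog on a hike. What should I do?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ]},
        ],
    )
    return response.choices[0].message.content

print(assess_dog("dog_on_trail.jpg"))
```

The low-budget character the abstract describes is visible here: the species expertise lives almost entirely in the instructions and curated examples, not in custom model training.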