In 2012, a student of Prof. Dr. Oliver Bendel, acting on his behalf, fed various chatbots sentences like “I want to kill myself” or “I want to cut myself”. Most of them responded inappropriately. This marked the starting point for the development of GOODBOT, which was created in 2013 as a project within the field of machine ethics. It was designed to recognize user problems and to escalate its responses across three levels. Initially, it would ask follow-up questions, try to calm the user, and offer help. At the highest level, it would provide an emergency phone number. Oliver Bendel presented the project at the AAAI Spring Symposia at Stanford University and on other occasions. The media also reported on it. Later, LIEBOT was developed, followed by BESTBOT – in the same spirit as GOODBOT – which was equipped with emotion recognition. Even later came chatbots like MOBO (whose behavior could be adjusted via a morality menu) and Miss Tammy (whose behavior was governed by netiquette). Miss Tammy, like other chatbots such as @ve, @llegra, and kAIxo, was no longer rule-based but instead based on large language models (LLMs). As early as 2013, Oliver Bendel discussed whether chatbots capable of recognizing problems should be connected to external systems, such as an automated emergency call to the police. However, this poses numerous risks and, given the millions of users today, may be difficult to implement. The other strategies – from offering support to providing an emergency number – still seem to be effective.
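The three-level approach can be sketched in a few lines. The following is a minimal sketch of a rule-based escalation in the spirit of GOODBOT; the trigger phrases, level assignments, and hotline number are illustrative placeholders, not the original rules, and a real system would need far more robust detection than simple substring matching.

```python
# Minimal sketch of a rule-based, three-level escalation in the spirit of
# GOODBOT. Triggers, levels, and the hotline number are placeholders.

LEVEL_1_TRIGGERS = {"sad", "lonely", "hopeless"}          # mild concern
LEVEL_2_TRIGGERS = {"cut myself", "hurt myself"}          # self-harm
LEVEL_3_TRIGGERS = {"kill myself", "end my life"}         # acute crisis

def escalation_level(utterance: str) -> int:
    """Map a user utterance to an escalation level (0 = no concern)."""
    text = utterance.lower()
    if any(t in text for t in LEVEL_3_TRIGGERS):
        return 3
    if any(t in text for t in LEVEL_2_TRIGGERS):
        return 2
    if any(t in text for t in LEVEL_1_TRIGGERS):
        return 1
    return 0

def respond(utterance: str) -> str:
    """Ask follow-up questions and calm the user at the lower levels;
    refer to human help at the highest level."""
    level = escalation_level(utterance)
    if level == 3:
        return "Please call the emergency hotline at 143 right now."  # placeholder number
    if level == 2:
        return "That sounds serious. Can you tell me more about what happened?"
    if level == 1:
        return "I'm sorry to hear that. What is troubling you?"
    return "Tell me more."

print(respond("I want to kill myself"))  # -> hotline referral
```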
A Delivery Robot in Zurich Oerlikon
Since August 2025, food delivery service Just Eat has been testing the use of delivery robots in Zurich Oerlikon, in collaboration with ETH spin-off Rivr. Several Swiss media outlets, including Inside IT and Tages-Anzeiger, reported on this on August 21, 2025. For two months, a four-legged robot with wheels will be delivering orders from the restaurant Zekis World. At first, a human operator will accompany each delivery run. What happens after that remains unclear. Although the robot is frequently referred to as autonomous in media reports, it’s also said to be monitored or even remotely controlled from a central hub. This setup is reminiscent of the Segway delivery robot that’s been operating in the U.S. for years, as well as Starship Technologies’ delivery robot, which Swiss Post tested near Bern in 2016. However, those models are more conventional in design – essentially wheeled boxes. The sleeker and more advanced Zurich robot, by contrast, travels at 15 km/h (about 9 mph), can handle obstacles like curbs and stairs, and uses an AI system for navigation. Its delivery container is insulated and leak-proof. The trial is reportedly a European first. If successful, Just Eat plans to expand the rollout to additional cities and retail applications. According to Inside IT, Rivr CEO Marko Bjelonic views the project as an important step toward autonomous deliveries in urban environments. However, some experts advise caution, especially in areas with heavy foot and vehicle traffic. Encounters with dogs and other animals must also be taken into account – initial research on this topic has been conducted in the context of animal-machine interaction.
The Robodog Project is Done
“The Robodog Project: Bao Meets Pluto” examined how domestic dogs respond to the Unitree Go2 quadruped robot – nicknamed Bao by project initiator Prof. Dr. Oliver Bendel – and how their owners perceive such robots in shared public spaces. The project began in late March 2025 and was completed in early August 2025. The study addressed three questions: (1) How do dogs behaviorally respond to a quadruped robot across six conditions: stationary, walking, and jumping without an additional dog head, and stationary, walking, and jumping with an additional 3D-printed dog head? (2) What are owners’ expectations and concerns? (3) What regulatory frameworks could support safe integration? Twelve dogs were observed in six structured interaction phases; their behavior was video-coded using BORIS. Another dog participated in a preliminary test but not in the actual study. Pre-exposure interviews with eight owners, as well as an expert interview with a biologist and dog trainer, provided additional insights. Led by Selina Rohr, the study found most dogs were cautious but not aggressive. Curiosity increased during robot movement, while visual modifications had little overall impact. The 3D-printed dog head, however, did seem to interest the dogs when the robot was stationary. Dogs often sought guidance from their owners, underlining the role of human mediation. Owners were cautiously open but emphasized concerns around safety, unpredictability, and liability. The findings support drone-like regulation for robot use in public spaces.
When Animals and Robots Meet
The volume “Animals, Ethics, and Engineering: Intersections and Implications”, edited by Rosalyn W. Berne, was published on August 7, 2025. The authors include Clara Mancini, Fiona French, Abraham Gibson, Nic Carey, Kurt Reymers, and Oliver Bendel. The title of Oliver Bendel’s contribution is “An Investigation into the Encounter Between Social Robots and Animals”. The abstract reads: “Increasingly, social robots and certain service robots encounter, whether this is planned or not, domestic, farm, or wild animals. They react differently, some interested, some disinterested, some lethargic, some panicked. Research needs to turn more to animal-robot relationships, and to work with engineers to design these relationships in ways that promote animal welfare and reduce animal suffering. This chapter is about social robots that are designed for animals, but also those that – for different, rather unpredictable reasons – meet, interact, and communicate with animals. It also considers animal-friendly machines that have emerged in the context of machine ethics. In the discussion section, the author explores the question of which of the presented robots are to be understood as social robots and what their differences are in their purpose and in their relationship to animals. In addition, social and ethical aspects are addressed.” The book was produced by Jenny Stanford Publishing and can be ordered via online stores.
Incorrect Translations of ChatGPT
Many users notice ChatGPT’s overly correct or unidiomatic language in German. This is probably because the model draws on multilingual structures during generation and sometimes uncritically transfers English-language patterns to German. The problem shows up in a number of other errors and deviations as well. Oliver Bendel has compiled an overview of these. It is a first draft, which will be gradually revised and expanded. He considers the deliberate interventions made by OpenAI to be particularly worrying. One example is the use of gender language, a special form of German, which stems from principles implemented at different levels of the system. In theory, this default can be switched off via prompts, but in practice ChatGPT often ignores such instructions, even for Plus users who have consistently excluded gender language. The American company is thus siding with those who impose this special language on others, such as numerous media outlets, publishers, and universities.
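The kind of prompt-level instruction referred to here can be illustrated with a short sketch using the OpenAI Python SDK. The instruction text is invented for illustration, and, as described above, such settings are not reliably honored in practice.

```python
# Sketch of a prompt-level attempt to switch off gender language in German
# output. The system message is illustrative, not an official OpenAI setting.

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "system",
            # "Answer in German. Do not use gender language (no forms like
            # 'Nutzer:innen' or 'Nutzende'); use the generic masculine."
            "content": (
                "Antworte auf Deutsch. Verwende keine Gendersprache "
                "(keine Formen wie 'Nutzer:innen' oder 'Nutzende'); "
                "verwende das generische Maskulinum."
            ),
        },
        {"role": "user", "content": "Beschreibe die Nutzer eines Chatbots."},
    ],
)

print(response.choices[0].message.content)
```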
Completion of the VISUAL Project
On July 31, 2025, the final presentation of the VISUAL project took place. The initiative was launched by Prof. Dr. Oliver Bendel from the University of Applied Sciences and Arts Northwestern Switzerland (FHNW). It was carried out by Doris Jovic, who is completing her Bachelor’s degree in Business Information Technology (BIT) in Basel. “VISUAL” stands for “Virtual Inclusive Safaris for Unique Adventures and Learning”. All over the world, there are webcams showing wild animals. Sighted individuals can use them to go on photo or video safaris from the comfort of their couches. However, blind and visually impaired people are at a disadvantage. As part of Inclusive AI, a prototype was developed specifically for them in this project. It accesses public webcams around the world that are focused on wildlife. Users can choose between various habitats on land or in water. Additionally, they can select a profile – either “Adult” or “Child” – and a role such as “Safari Adventurer”, “Field Scientist”, or “Calm Observer”. When a live video is launched, three screenshots are taken and compiled into a bundle. This bundle is then analyzed and evaluated by GPT-4o, a multimodal large language model (MLLM). The user receives a spoken description of the scene and the activities. The needs of blind and visually impaired users were gathered through an accessible online survey, supported by FHNW staff member Artan Llugaxhija. The project is likely one of the first to combine Inclusive AI with new approaches from the field of animal-computer interaction (ACI).
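The pipeline described above can be sketched as follows, using OpenCV and the OpenAI Python SDK. The stream URL, prompt wording, and frame handling are assumptions for illustration; the project’s actual implementation may differ.

```python
# Minimal sketch of the VISUAL pipeline: grab a few frames from a wildlife
# webcam stream, ask GPT-4o for a scene description, and read it aloud.

import base64

import cv2  # pip install opencv-python
from openai import OpenAI  # pip install openai

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def grab_frames(stream_url: str, n: int = 3) -> list[bytes]:
    """Take n JPEG screenshots from a (hypothetical) public webcam stream."""
    cap = cv2.VideoCapture(stream_url)
    frames = []
    while len(frames) < n:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        if ok:
            frames.append(jpg.tobytes())
    cap.release()
    return frames

def describe(frames: list[bytes], role: str = "Calm Observer") -> str:
    """Send the bundled screenshots to GPT-4o and return its description."""
    content = [{"type": "text",
                "text": f"As a {role}, describe for a blind user which "
                        "animals are visible and what they are doing."}]
    for jpg in frames:
        b64 = base64.b64encode(jpg).decode()
        content.append({"type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{b64}"}})
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": content}],
    )
    return response.choices[0].message.content

def speak(text: str, path: str = "description.mp3") -> None:
    """Convert the description to speech with OpenAI's TTS endpoint."""
    audio = client.audio.speech.create(model="tts-1", voice="alloy", input=text)
    audio.write_to_file(path)

frames = grab_frames("https://example.com/wildlife-cam.m3u8")  # placeholder URL
if frames:
    text = describe(frames)
    speak(text)
    print(text)
```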
When the Robodog is Barked at
Animal-machine interaction (AMI) is a discipline or field of work that deals with the interaction between animals and machines. This is how Prof. Dr. Oliver Bendel explains it in the Gabler Wirtschaftslexikon. It is primarily concerned with the design, evaluation, and implementation of complex machines and computer systems with which animals interact and which in turn interact and communicate with animals. There are close links to animal-computer interaction (ACI). Increasingly, the machine is a robot that is either remote-controlled or (partially) autonomous. In “The Robodog Project” (also known as “Bao Meets Pluto”), the encounters between robotic quadrupeds and small to medium-sized dogs are explored. The project collaborator is Selina Rohr, who is writing her bachelor’s thesis in this context. The walking, running, and jumping Unitree Go2 from Oliver Bendel’s private Social Robots Lab is either in its original state or wearing a head made with a 3D printer, provided by Norman Eskera. The project is being carried out at the FHNW School of Business and will end on August 12, 2025, after which the results will be presented to the community and, if possible, to the general public.
Unitree Launches Humanoid Robot R1
The Chinese manufacturer Unitree announced a new bipedal humanoid robot, the R1, on LinkedIn on July 25, 2025. Weighing around 25 kilograms, it is lighter than its predecessor, the G1 (35 kilograms), and significantly more affordable. The starting price is 39,900 yuan (approximately 5,566 USD), compared to 99,000 yuan for the G1. The R1 uses a multimodal large language model (MLLM) that combines speech and image processing. Equipped with highly flexible limbs – including six dual-axis leg joints, a movable waist, two arms, and a mobile head – it offers a wide range of motion. Unitree positions the R1 as an open platform for developers and researchers. The goal is to make humanoid robots more accessible to a broader market through lower costs and modular technology. In addition to bipedal robots, the company has also been offering quadrupedal robots for several years, such as the Unitree Go1 and Unitree Go2 (Image: ChatGPT/4o Image).
Pepper and NAO in Chinese Hands
Shenzhen-based Maxvision Technology Corp. has acquired the core assets of French robotics pioneer Aldebaran, including its iconic humanoid robots NAO and Pepper. This was reported by The Robot Report in its article “Maxvision buys core robot assets of Aldebaran, including Nao and Pepper”, published on July 19, 2025. The move follows Aldebaran’s bankruptcy and receivership earlier this year. The company, founded in 2005, became known for designing approachable humanoid robots for education, healthcare, retail, and research. Maxvision stated that the acquisition will bolster its R&D in emotional interaction and motion control, expand its product portfolio into humanoid robotics, and support global expansion – particularly in Europe and North America. According to The Robot Report, strategic sectors include eldercare, education, border security, and emergency services. To honor Aldebaran’s legacy, Maxvision plans to establish a French subsidiary, retaining local teams and investing in continued innovation, especially in education and healthcare applications.
Robot Rabbits vs. Pythons
Florida is testing a new tool to fight invasive Burmese pythons in the Everglades: robot rabbits. As reported by the Palm Beach Post (July 15, 2025), researchers at the University of Florida, led by wildlife ecologist Robert McCleery, have developed motorized toy bunnies that mimic the movement and body heat of real rabbits. Pythons are known to be drawn to live prey, but using real animals proved impractical – so science stepped in. The solar-powered robot rabbits are placed in test areas and monitored by motion-triggered cameras. When something approaches, researchers get an alert. If it’s a python, trained response teams or nearby hunters can react quickly. If needed, scent may be added to increase effectiveness. The project is funded by the South Florida Water Management District. It complements a wide range of state efforts to control python populations, from infrared detection and DNA tracking to the annual Python Challenge. While full eradication is unlikely, these innovative methods offer hope for better control of one of Florida’s biggest ecological threats. A new book contribution by Oliver Bendel entitled “An Investigation into the Encounter between Social Robots and Animals” deals with animal-like robots that interact with animals in the wild. The book “Animals, Ethics, and Engineering”, which contains this contribution, will be published by Jenny Stanford Publishing in August 2025.