When humans come into contact with wildlife, farm animals, and pets, they sometimes run the risk of being injured or killed. They may be attacked by bears, wolves, cows, horses, or dogs. Experts can use an animal’s body language to determine whether or not danger is imminent. Context is also important, such as whether a mother cow is with her calves. The multimodality of large language models enables novel applications. For example, ChatGPT can evaluate images. This ability can be used to interpret the body language of animals, thus drawing on and partly replacing expert knowledge. Prof. Dr. Oliver Bendel, who has been involved with animal-computer interaction and animal-machine interaction for many years, has initiated a project called “The Animal Whisperer” in this context. The goal is to create a prototype application based on GenAI that can interpret an animal’s body language and help avert danger to humans. GPT-4 or an open-source language model will be used to create the prototype, augmented with appropriate material on animals such as bears, wolves, cows, horses, and dogs. Approaches may include fine-tuning or prompt engineering. The project will begin in March 2024, and the results will be available in the summer of the same year (Image: DALL-E 3).
Goodbye Apple Car, Hello GenAI
Apple’s ambitions to enter the automotive business are apparently history. This was reported by Bloomberg: “Apple Inc. is canceling a decadelong effort to build an electric car, according to people with knowledge of the matter, abandoning one of the most ambitious projects in the history of the company.” (Bloomberg, 27 February 2024) Numerous media outlets around the world have picked up the story. The company had wanted to launch an autonomous electric car on the market. Apple never communicated this publicly, but it was common knowledge. The project, run within the Special Projects Group (SPG), is now to be wound down, and the remaining employees are to focus on generative AI, an area where Apple wants to catch up in the coming months. So you could say: goodbye Apple Car, hello GenAI.
GenAI for the Blind
At the AAAI 2024 Spring Symposium “Impact of GenAI on Social and Individual Well-being,” the paper “How Can Generative AI Enhance the Well-being of the Blind?” by Oliver Bendel was accepted. In his paper, the information systems specialist and technology philosopher from Zurich discusses the GPT-4-based Be My AI feature of the Be My Eyes app. He presents his own tests with the app and discusses it from an ethical perspective. The feature is one of the most important inventions of recent years for blind and visually impaired people. It allows them to have their surroundings described and categorized without outside help. However, it is troubling that the app refuses to describe some objects, including famous works of art that depict nudity. This disenfranchises users as a result of the developers’ moral sensitivities and economic considerations. Oliver Bendel will present the paper at Stanford University on March 25-27. It is his ninth consecutive appearance at the AAAI Spring Symposia, which this time consist of eight symposia on artificial intelligence.