Towards SPACE THEA

Social robots are robots that come close to animals and humans and interact and communicate with them. They reproduce characteristics of animals and humans in their behavior and appearance. They can be implemented both as hardware robots and as software robots. The SPACE THEA project was originally scheduled to start in March 2020 but had to be postponed because of COVID-19. Now the master's student and his supervisor, Prof. Dr. Oliver Bendel (both School of Business FHNW), are starting the preparatory work. The voicebot will then be programmed in winter 2020/2021 and spring 2021. SPACE THEA is designed to accompany astronauts to Mars and to show them empathy and emotions. In the best case, she should also be able to provide psychological counseling, for example based on cases from the literature. The project will draw on findings from social robotics, but also from machine ethics. The results will be available by summer 2021.

A Spider that Reads the Whole Web

Diffbot, a Stanford startup, is building an AI-based spider that reads as many pages as possible on the entire public web, and extracts as many facts from those pages as it can. “Like GPT-3, Diffbot’s system learns by vacuuming up vast amounts of human-written text found online. But instead of using that data to train a language model, Diffbot turns what it reads into a series of three-part factoids that relate one thing to another: subject, verb, object.” (MIT Technology Review, 4 September 2020) Knowledge graphs – which is what this is all about – have been around for a long time. However, they have mostly been created manually or only for certain areas. Some years ago, Google started using knowledge graphs too. Instead of giving us a list of links to pages about Spider-Man, the service gives us a set of facts about him drawn from its knowledge graph. But it only does this for its most popular search terms. According to MIT Technology Review, the startup wants to do it for everything. “By fully automating the construction process, Diffbot has been able to build what may be the largest knowledge graph ever.” (MIT Technology Review, 4 September 2020) Diffbot’s AI-based spider reads the web as we read it and sees the same facts that we see. Even if it does not really understand what it sees, we will be amazed at the results.
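The subject-verb-object factoids described above can be illustrated with a minimal sketch. The class, method names and example facts below are invented for illustration; Diffbot's actual data model is far richer.

```python
# Minimal sketch of a knowledge graph built from subject-verb-object triples,
# as described in the MIT Technology Review quote. Illustration only.

class KnowledgeGraph:
    def __init__(self):
        self.triples = set()  # each fact is a (subject, verb, object) tuple

    def add(self, subject, verb, obj):
        self.triples.add((subject, verb, obj))

    def facts_about(self, subject):
        """Return all (verb, object) pairs known for a subject."""
        return [(v, o) for s, v, o in self.triples if s == subject]

kg = KnowledgeGraph()
kg.add("Spider-Man", "is", "a fictional superhero")
kg.add("Spider-Man", "appears in", "Marvel Comics")

facts = kg.facts_about("Spider-Man")
print(facts)
```

Storing facts as triples rather than as raw text is what allows a service to answer a query with a set of facts instead of a list of links.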

Simple, Soft Social Robots

With Hugvie and the Somnox Sleep Robot, researchers and companies have made it clear that it is possible to build simple, soft social robots that have a certain utility and impact. This raises hopes for social robotics, which is currently showing some progress, but is still developing slowly. Materials such as rubber and plastic can be used to make simple, soft social robots. These materials can be combined with existing devices such as smartphones and tablets on which certain applications run, or with simple sensors and electronic components. The project or thesis, announced by Oliver Bendel in August 2020 at the School of Business FHNW, will first research the basics of simple, soft social robots. The work of Hiroshi Ishiguro and Alexis Block (with Katherine J. Kuchenbecker) will be included. Then examples of implementation forms will be outlined and sketched. Their purpose and benefits are presented, as well as possible areas of application. One example is to be implemented; speech abilities and sounds are options, as are vibration and electrical impulses. Links to applications in the household, in public spaces or in the commercial sector should be established. The project will start in February 2021.

Machine Dance

Which moves go with which song? Should I do the Floss, the Dougie or the Robot? Or should I create a new style? But which one? An AI system could help answer these questions in the future. At least an announcement by a social media platform raises this hope: “Facebook AI researchers have developed a system that enables a machine to generate a dance for any input music. It’s not just imitating human dance movements; it’s creating completely original, highly creative routines. That’s because it uses finely tuned search procedures to stay synchronized and surprising, the two main criteria of a creative dance. Human evaluators say that the AI’s dances are more creative and inspiring than meaningful baselines.” (Website FB) The AI system could inspire dancers when they get stuck and help them to constantly improve. More information via about.fb.com/news/2020/08/ai-dancing-facebook-research/.

Towards Robot Enhancement

Social robots and service robots usually have a defined locomotor system, a defined appearance and defined facial and gestural abilities. This leads, on the one hand, to a certain familiarization effect. On the other hand, the actions of the robots are thus limited, for example in the household or in a shopping mall. Robot enhancement is used to extend and improve social robots and service robots. It changes their appearance and expands their scope. It is possible to apply attachments to the hardware, extend limbs and exchange components. One can pull a skin made of silicone over the face or head, making the robots look humanoid. One can also change the software and connect the robot to AI systems – this has already been done many times. The project or thesis, announced by Oliver Bendel in August 2020 at the School of Business FHNW, should first present the principles and functional possibilities of robot enhancement. Second, concrete examples should be given and described. One of these examples, e.g., the silicone skin, is to be implemented. Robots like Pepper or Atlas would be completely changed by such a skin. They could look uncanny, but also appealing. The project will start in September 2020.

Launch of the Interactive AI Magazine

AAAI has announced the launch of the Interactive AI Magazine. According to the organization, the new platform provides online access to articles and columns from AI Magazine, as well as news and articles from AI Topics and other materials from AAAI. “Interactive AI Magazine is a work in progress. We plan to add a lot more content on the ecosystem of AI beyond the technical progress represented by the AAAI conference, such as AI applications, AI industry, education in AI, AI ethics, and AI and society, as well as conference calendars and reports, honors and awards, classifieds, obituaries, etc. We also plan to add multimedia such as blogs and podcasts, and make the website more interactive, for example, by enabling commentary on posted articles. We hope that over time Interactive AI Magazine will become both an important source of information on AI and an online forum for conversations among the AI community.” (AAAI Press Release) More information via interactiveaimag.org.

Next ROBOPHILOSOPHY in Helsinki

One of the world’s most important conferences for robot philosophy (aka robophilosophy) and social robotics, ROBOPHILOSOPHY, took place from 18 to 21 August 2020, not in Aarhus (Denmark) as originally planned, but – due to the COVID-19 pandemic – in virtual form. Organizers and presenters were Marco Nørskov and Johanna Seibt. A considerable number of the lectures were devoted to machine ethics, such as “Moral Machines” (Aleksandra Kornienko), “Permissibility-Under-a-Description Reasoning for Deontological Robots” (Felix Lindner) and “The Morality Menu Project” (Oliver Bendel). The keynotes were given by Selma Šabanović (Indiana University Bloomington), Robert Sparrow (Monash University), Shannon Vallor (The University of Edinburgh), Alan Winfield (University of the West of England), Aimee van Wynsberghe (Delft University of Technology) and John Danaher (National University of Ireland). In his outstanding presentation, Winfield was sceptical about moral machines. In the discussion, Bendel made it clear that they are useful in some areas and dangerous in others, and emphasized the importance of machine ethics for the study of machine and human morality, a point with which Winfield agreed. The last conference was held in Vienna in 2018. Keynote speakers at that time included Hiroshi Ishiguro, Joanna Bryson and Oliver Bendel. The next ROBOPHILOSOPHY will probably take place in 2022 at the University of Helsinki, as the organisers announced at the end of the event.

3D Printing a Robotic Finger

Nao and Pepper have perfectly shaped hands and fingers. But the fingers are now facing serious competition. Scientists at the University of California, Santa Cruz, and Ritsumeikan University in Japan have designed and produced a robotic finger inspired by the human endoskeletal structure. From the abstract of the paper “3D Printing an Assembled Biomimetic Robotic Finger”: “We present a novel approach for fabricating cable-driven robotic systems. Particularly, we show that a biomimetic finger featuring accurate bone geometry, ligament structures, and viscoelastic tendons can be synthesized as a single part using a multi-material 3D printer. This fabrication method eliminates the need to engineer an interface between the rigid skeletal structure and elastic tendon system. The artificial muscles required to drive the printed tendons of the finger can also be printed in place.” The biomimetic robotic finger was presented at the 17th International Conference on Ubiquitous Robots (UR). The paper is available here.

The MOML Project

In many cases it is important that an autonomous system acts and reacts adequately from a moral point of view. There are some artifacts of machine ethics, e.g., GOODBOT or LADYBIRD by Oliver Bendel or Nao as a care robot by Susan Leigh Anderson and Michael Anderson. But there is no standardization in the field of moral machines yet. The MOML project, initiated by Oliver Bendel, is trying to work in this direction. In the management summary of his bachelor thesis Simon Giller writes: “We present a literature review in the areas of machine ethics and markup languages which shaped the proposed morality markup language (MOML). To overcome the most substantial problem of varying moral concepts, MOML uses the idea of the morality menu. The menu lets humans define moral rules and transfer them to an autonomous system to create a proxy morality. Analysing MOML excerpts allowed us to develop an XML schema which we then tested in a test scenario. The outcome is an XML-based morality markup language for autonomous agents. Future projects can use this language or extend it. Using the schema, anyone can write MOML documents and validate them. Finally, we discuss new opportunities, applications and concerns related to the use of MOML. Future work could develop a controlled vocabulary or an ontology defining terms and commands for MOML.” The bachelor thesis will be publicly available in autumn 2020. It was supervised by Dr. Elzbieta Pustulka. There will also be a paper with the results next year.
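Since the actual MOML schema is defined in the thesis itself, the following Python sketch uses invented element names to illustrate the general idea of a morality menu expressed as markup: a human switches moral rules on or off, and the resulting document is transferred to an autonomous system as its proxy morality. All element names and rule texts here are hypothetical, not the real MOML vocabulary.

```python
# Hypothetical illustration of a morality-menu-style markup document.
# The <moml>, <rule>, <behaviour> and <allowed> elements are invented
# for this sketch and do not reflect the real MOML schema.
import xml.etree.ElementTree as ET

def build_moml(rules):
    """Build a MOML-like XML document from (behaviour, allowed) pairs."""
    root = ET.Element("moml")
    for behaviour, allowed in rules:
        rule = ET.SubElement(root, "rule")
        ET.SubElement(rule, "behaviour").text = behaviour
        ET.SubElement(rule, "allowed").text = str(allowed).lower()
    return ET.tostring(root, encoding="unicode")

# A human operator configures the proxy morality via the menu:
doc = build_moml([
    ("lie to the user", False),
    ("warn before uploading data", True),
])
print(doc)
```

In the real project, such documents would additionally be validated against the XML schema developed in the thesis, so that any autonomous agent reading them can rely on a common structure.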

Findings on Robotic Hugging

In the first part of the HUGGIE project initiated by Oliver Bendel, two students of the School of Business FHNW conducted an online survey with almost 300 participants. In the management summary of their bachelor thesis Ümmühan Korucu and Leonie Stocker (formerly Leonie Brogle) write: “The results of the survey indicated that people have a positive attitude towards robots in general as robots are perceived as interesting and useful rather than unnecessary and disturbing. However, only a minority of the participants stated that they would accept a hug from a robot. A possible reason for this could be that for the majority of participants, a hug is an act of intimacy with a deeper meaning attached to it which is only being shared with selected persons. For a robot to be perceived as an attractive hugging partner, a human-like design including a face, eyes, a friendly look as well as the ability to communicate verbally and non-verbally is desired. However, an appearance being too realistic has a deterrent effect. Furthermore, an in-depth analysis of the data in relation to age and gender of the participants resulted in the discovery of interesting facts and differences. Overall, the findings contribute to a clearer picture about the appearance and the features Huggie should have in order to be accepted as a hugging counterpart.” The bachelor thesis will be publicly available in autumn 2020. There will also be a paper with the results next year.