Machine Dance

Which moves go with which song? Should I do the Floss, the Dougie or the Robot? Or should I create a new style? But which one? An AI system could help answer such questions in the future – at least an announcement by a social media company raises this hope: “Facebook AI researchers have developed a system that enables a machine to generate a dance for any input music. It’s not just imitating human dance movements; it’s creating completely original, highly creative routines. That’s because it uses finely tuned search procedures to stay synchronized and surprising, the two main criteria of a creative dance. Human evaluators say that the AI’s dances are more creative and inspiring than meaningful baselines.” (Website FB) The AI system could inspire dancers when they get stuck and help them to constantly improve. More information via about.fb.com/news/2020/08/ai-dancing-facebook-research/.

Towards Robot Enhancement

Social robots and service robots usually have a defined locomotor system, a defined appearance, and defined facial and gestural capabilities. On the one hand, this leads to a certain familiarization effect. On the other hand, it limits what the robots can do, for example in the household or in a shopping mall. Robot enhancement is used to extend and improve social robots and service robots. It changes their appearance and expands their scope. It is possible to apply attachments to the hardware, extend limbs, and exchange components. One can pull a skin made of silicone over the face or head, making the robots look humanoid. One can also change the software and connect the robot to AI systems – this is already being done in many cases. The project or thesis, announced by Oliver Bendel in August 2020 at the School of Business FHNW, is intended first to present the principles and functional possibilities of robot enhancement. Second, concrete examples should be given and described. One of these examples, e.g., the silicone skin, is to be implemented. Robots like Pepper or Atlas would be completely changed by such a skin. They could look uncanny, but also appealing. The project will start in September 2020.

Launch of the Interactive AI Magazine

AAAI has announced the launch of the Interactive AI Magazine. According to the organization, the new platform provides online access to articles and columns from AI Magazine, as well as news and articles from AI Topics and other materials from AAAI. “Interactive AI Magazine is a work in progress. We plan to add lot more content on the ecosystem of AI beyond the technical progress represented by the AAAI conference, such as AI applications, AI industry, education in AI, AI ethics, and AI and society, as well as conference calendars and reports, honors and awards, classifieds, obituaries, etc. We also plan to add multimedia such as blogs and podcasts, and make the website more interactive, for example, by enabling commentary on posted articles. We hope that over time Interactive AI Magazine will become both an important source of information on AI and an online forum for conversations among the AI community.” (AAAI Press Release) More information via interactiveaimag.org.

Next ROBOPHILOSOPHY in Helsinki

One of the world’s most important conferences for robot philosophy (aka robophilosophy) and social robotics, ROBOPHILOSOPHY, took place from 18 to 21 August 2020, not in Aarhus (Denmark) as originally planned, but – due to the COVID-19 pandemic – in virtual form. Organizers and presenters were Marco Nørskov and Johanna Seibt. A considerable number of the lectures were devoted to machine ethics, such as “Moral Machines” (Aleksandra Kornienko), “Permissibility-Under-a-Description Reasoning for Deontological Robots” (Felix Lindner) and “The Morality Menu Project” (Oliver Bendel). The keynotes were given by Selma Šabanović (Indiana University Bloomington), Robert Sparrow (Monash University), Shannon Vallor (The University of Edinburgh), Alan Winfield (University of the West of England), Aimee van Wynsberghe (Delft University of Technology) and John Danaher (National University of Ireland). In his outstanding presentation, Winfield was sceptical about moral machines, whereupon Bendel made it clear in the discussion that they are useful in some areas and dangerous in others, and emphasized the importance of machine ethics for the study of machine and human morality, a point with which Winfield again agreed. The last conference was held in Vienna in 2018. Keynote speakers at that time included Hiroshi Ishiguro, Joanna Bryson and Oliver Bendel. The next ROBOPHILOSOPHY will probably take place in 2022 at the University of Helsinki, as the organizers announced at the end of the event.

3D Printing a Robotic Finger

Nao and Pepper have perfectly shaped hands and fingers. But the fingers are now facing serious competition. Scientists at the University of California, Santa Cruz, and Ritsumeikan University in Japan have designed and produced a robotic finger inspired by the human endoskeletal structure. From the abstract of the paper “3D Printing an Assembled Biomimetic Robotic Finger”: “We present a novel approach for fabricating cable-driven robotic systems. Particularly, we show that a biomimetic finger featuring accurate bone geometry, ligament structures, and viscoelastic tendons can be synthesized as a single part using a multi-material 3D printer. This fabrication method eliminates the need to engineer an interface between the rigid skeletal structure and elastic tendon system. The artificial muscles required to drive the printed tendons of the finger can also be printed in place.” The biomimetic robotic finger was presented at the 17th International Conference on Ubiquitous Robots (UR). The paper is available online.

The MOML Project

In many cases it is important that an autonomous system acts and reacts adequately from a moral point of view. There are some artifacts of machine ethics, e.g., GOODBOT or LADYBIRD by Oliver Bendel or Nao as a care robot by Susan Leigh Anderson and Michael Anderson. But there is no standardization in the field of moral machines yet. The MOML project, initiated by Oliver Bendel, is trying to work in this direction. In the management summary of his bachelor thesis Simon Giller writes: “We present a literature review in the areas of machine ethics and markup languages which shaped the proposed morality markup language (MOML). To overcome the most substantial problem of varying moral concepts, MOML uses the idea of the morality menu. The menu lets humans define moral rules and transfer them to an autonomous system to create a proxy morality. Analysing MOML excerpts allowed us to develop an XML schema which we then tested in a test scenario. The outcome is an XML based morality markup language for autonomous agents. Future projects can use this language or extend it. Using the schema, anyone can write MOML documents and validate them. Finally, we discuss new opportunities, applications and concerns related to the use of MOML. Future work could develop a controlled vocabulary or an ontology defining terms and commands for MOML.” The bachelor thesis will be publicly available in autumn 2020. It was supervised by Dr. Elzbieta Pustulka. There will also be a paper with the results next year.
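Since MOML is XML-based, a morality menu could in principle be written, parsed, and validated with standard XML tools. The following sketch is purely hypothetical: the element names, attributes, and rules are invented for illustration and are not taken from the actual MOML schema described in the thesis.

```python
# Hypothetical sketch of an XML-based morality markup language.
# All element and attribute names below are invented for illustration;
# they do not reflect the real MOML schema.
import xml.etree.ElementTree as ET

MOML_EXAMPLE = """\
<moml version="0.1">
  <agent id="household-robot-1">
    <rule id="r1" priority="1">
      <condition>animal_detected</condition>
      <action>evade</action>
    </rule>
    <rule id="r2" priority="2">
      <condition>owner_absent</condition>
      <action>report_visitors</action>
    </rule>
  </agent>
</moml>
"""

def load_rules(moml_text):
    """Parse a (hypothetical) MOML document and return the
    (condition, action) pairs ordered by declared priority."""
    root = ET.fromstring(moml_text)
    rules = []
    for rule in root.iter("rule"):
        rules.append((
            int(rule.get("priority")),
            rule.findtext("condition"),
            rule.findtext("action"),
        ))
    rules.sort()  # lowest priority number first
    return [(cond, act) for _, cond, act in rules]

print(load_rules(MOML_EXAMPLE))
# [('animal_detected', 'evade'), ('owner_absent', 'report_visitors')]
```

A real implementation would additionally validate such documents against the XML schema developed in the thesis; here the rules are simply extracted and ordered, which is enough to hand a proxy morality to an agent.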

Findings on Robotic Hugging

In the first part of the HUGGIE project initiated by Oliver Bendel, two students of the School of Business FHNW conducted an online survey with almost 300 participants. In the management summary of their bachelor thesis Ümmühan Korucu and Leonie Stocker (formerly Leonie Brogle) write: “The results of the survey indicated that people have a positive attitude towards robots in general as robots are perceived as interesting and useful rather than unnecessary and disturbing. However, only a minority of the participants stated that they would accept a hug from a robot. A possible reason for this could be that for the majority of participants, a hug is an act of intimacy with a deeper meaning attached to it which is only being shared with selected persons. For a robot to be perceived as an attractive hugging partner, a human-like design including a face, eyes, a friendly look as well as the ability to communicate verbally and non-verbally is desired. However, an appearance being too realistic has a deterrent effect. Furthermore, an in-depth analysis of the data in relation to age and gender of the participants resulted in the discovery of interesting facts and differences. Overall, the findings contribute to a clearer picture about the appearance and the features Huggie should have in order to be accepted as a hugging counterpart.” The bachelor thesis will be publicly available in autumn 2020. There will also be a paper with the results next year.

Ingenuity on Mars

The Perseverance rover, which is on its way to Mars, is carrying a drone called Ingenuity (photo/concept: NASA). According to NASA, it is a technology demonstration to test powered flight on another world for the first time. “A series of flight tests will be performed over a 30-Martian-day experimental window that will begin sometime in the spring of 2021. For the very first flight, the helicopter will take off a few feet from the ground, hover in the air for about 20 to 30 seconds, and land. That will be a major milestone: the very first powered flight in the extremely thin atmosphere of Mars! After that, the team will attempt additional experimental flights of incrementally farther distance and greater altitude.” (Website NASA) After the drone has completed its technology demonstration, the rover will continue its scientific mission. Manned and unmanned flights to Mars will bring us several innovations, including novel chatbots and voicebots.

Four-Legged Robots to Scout Factories

Ford is experimenting with four-legged robots to scout factories. The aim is to save time and money. The Ford Media Center presented the procedure on 26 July 2020 as follows: “Ford is tapping four-legged robots at its Van Dyke Transmission Plant in early August to laser scan the plant, helping engineers update the original computer-aided design which is used when we are getting ready to retool our plants. These robots can be deployed into tough-to-reach areas within the plant to scan the area with laser scanners and high-definition cameras, collecting data used to retool plants, saving Ford engineers time and money. Ford is leasing two robots, nicknamed Fluffy and Spot, from Boston Dynamics – a company known for building sophisticated mobile robots.” (Website Ford Media Center) Typically, service robots (e.g., transport robots like Relay) scan buildings to create 2D or 3D models that help them navigate through the rooms. Shuttles use lidar systems to create live 3D models of the environment in order to detect obstacles. The robots from Boston Dynamics are also mobile, and that is their great advantage (photo: Ford). Nothing can escape them, nothing can hide from them. The benefit could probably be increased by including cameras in the building, i.e., by using robot2x communication.
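The mapping step mentioned above can be illustrated with a minimal occupancy-grid sketch: lidar range readings taken from a known pose are converted into occupied cells of a 2D map. This is a generic textbook technique with all numbers and names assumed for illustration, not the actual software of Ford or Boston Dynamics.

```python
# Minimal 2D occupancy-grid sketch: mark the grid cells hit by
# simulated lidar returns. A generic illustration of how scan data
# becomes a navigable map; not Ford's or Boston Dynamics' pipeline.
import math

def scan_to_grid(pose, ranges, angle_step, cell_size, grid_dim):
    """Convert lidar range readings (in meters) taken from `pose`
    (x, y) into a set of occupied cells of a square 2D grid."""
    x0, y0 = pose
    occupied = set()
    for i, r in enumerate(ranges):
        if r is None:  # no return: nothing within sensor range
            continue
        theta = i * angle_step
        # Rounding suppresses floating-point noise before flooring.
        x = round(x0 + r * math.cos(theta), 9)
        y = round(y0 + r * math.sin(theta), 9)
        cell = (int(x // cell_size), int(y // cell_size))
        if 0 <= cell[0] < grid_dim and 0 <= cell[1] < grid_dim:
            occupied.add(cell)
    return occupied

# Four beams at 90-degree spacing from the middle of a 10 x 10 m room,
# each hitting an obstacle 2 m away.
cells = scan_to_grid(pose=(5.0, 5.0), ranges=[2.0, 2.0, 2.0, 2.0],
                     angle_step=math.pi / 2, cell_size=1.0, grid_dim=10)
print(sorted(cells))  # [(3, 5), (5, 3), (5, 7), (7, 5)]
```

A real system would also mark the free cells along each beam (e.g., via ray tracing) and fuse many scans from a moving robot probabilistically, which is what SLAM software does at scale.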

Who can be My Companion?

Science fiction regularly portrays deep friendship or even romantic relationships between a human and a machine, e.g., Dolores and William (Westworld, 2016), Joi and K (Blade Runner 2049, 2017) or Poe and Takeshi Kovacs (Altered Carbon, 2018 and 2020). In fact, for several years now, there has been a development approach aiming to create artificial companions. The so-called companion paradigm focuses on social abilities in order to create adaptive systems that adjust their behavior to the user and his or her environment [1]. This development approach allows a high degree of individualization and customization. The paradigm principally intends to reduce the complexity of innovative technology, which can otherwise go hand in hand with a lack of user-friendliness and frustrated users [2]. In 2015, Ulm University hosted the International Symposium on Companion-Technology (ISCT), whose proceedings give a broad overview of the research issues in this field [3]. Some of the research questions discussed concerned input modalities for recognizing emotional states and the user’s current situation, or dialog strategies that allow an artificial companion to build a trustworthy relationship. Although the paradigm is already pursued in a quite interdisciplinary way, Andreas Hepp (2020) has recently called on communication and media scholars to participate more actively in these discussions, since human-machine communication in particular has so far been explored without much involvement of communication scholars [4]. In terms of perception, intriguing questions concern, for example, possible gradations among different companion systems and the effects these have on interaction and communication with the technology. Such questions have to be discussed not only by computer scientists but also by scholars of psychology and philosophy, especially when it comes to how human-machine relationships will develop in the long run.
Will companion systems drift into unemotional and function-centric routines as we have with other technologies, or can they become our forever friends?


References

[1] Wahl M., Krüger S., & Frommer J. (2015). Well-intended, but not Well Perceived: Anger and Shame in Reaction to an Affect-oriented Intervention Applied in User-Companion Interaction. In: Biundo-Stephan S., Wendemuth A., & Rukzio E. (Eds.). (2015). Proceedings of the 1st International Symposium on Companion-Technology (ISCT 2015), September 23rd-25th, Ulm University, Germany. pp. 114-119. https://doi.org/10.18725/OPARU-3252

[2] Biundo S., Höller D., Schattenberg B., & Bercher P. (2016). Companion-Technology: An Overview. KI – Künstliche Intelligenz, 30(1), 11–20. https://doi.org/10.1007/s13218-015-0419-3

[3] Biundo-Stephan S., Wendemuth A., & Rukzio E. (Eds.). (2015). Proceedings of the 1st International Symposium on Companion-Technology (ISCT 2015)—September 23rd-25th, Ulm University, Germany. https://doi.org/10.18725/OPARU-3252

[4] Hepp A. (2020). Artificial companions, social bots and work bots: Communicative robots as research objects of media and communication studies. Media, Culture & Society, 016344372091641. https://doi.org/10.1177/0163443720916412