In January 2023, the Proceedings of Robophilosophy 2022 were published. Included is the paper “Robots in Policing” by Oliver Bendel. From the abstract: “This article is devoted to the question of how robots are used in policing and what opportunities and risks arise in social terms. It begins by briefly explaining the characteristics of modern police work. It puts service robots and social robots in relation to each other and outlines relevant disciplines. The article also lists types of robots that are and could be relevant in the present context. It then gives examples from different countries of the use of robots in police work and security services. From these, it derives the central tasks of robots in this area and their most important technical features. A discussion from social, ethical, and technical perspectives seeks to provide clarity on how robots are changing the police as a social institution and with social actions and relationships, and what challenges need to be addressed.” (Abstract) Robots in policing is a topic that has not received much attention. However, it is likely to become considerably more topical in the next few years. More information about the conference is available at cas.au.dk/en/robophilosophy/conferences/rpc2022 (Photo: Anna Jarske-Fransas).
A Robot Charging Station for the Disabled
According to a media release, Ford has developed a prototype robot charging station that drivers operate via their smartphone from inside their electric vehicle. The technology could enable disabled persons to stay in the Ford (or another car) while charging, or they could leave the vehicle while the robot completes the task. Disabled drivers, the company says, have already identified ease of charging as a key purchase consideration for electric vehicles. The robot charging station is being tested as part of a research project to develop hands-free charging solutions for electric vehicles and fully automatic charging for autonomous vehicles. “Following initial lab testing, Ford researchers are now putting the robot charging station to the test in real-life situations. Once activated, the station cover slides open and the charging arm extends towards the inlet with the help of a tiny camera. For the trial, drivers were able to monitor the charge status via the FordPass app. After charging, the arm retracts back into place.” (Ford Media Release, July 20, 2022) More information is available via the Ford Media Center.
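The sequence the release describes – cover opens, a camera guides the arm to the inlet, the app reports the charge status, the arm retracts – can be pictured as a small state machine. The following Python sketch is purely illustrative: all names, states, and callbacks are assumptions, since Ford’s actual software and interfaces are not public.

```python
from enum import Enum, auto

class StationState(Enum):
    IDLE = auto()
    COVER_OPEN = auto()
    CHARGING = auto()
    RETRACTED = auto()

def run_charging_cycle(locate_inlet, report_status):
    """One camera-guided charging cycle; the callbacks stand in for hardware."""
    state = StationState.COVER_OPEN       # driver activates the station via app
    inlet_pose = locate_inlet()           # tiny camera steers the arm to the inlet
    if inlet_pose is None:
        report_status("inlet not found")  # abort: arm never leaves the station
    else:
        state = StationState.CHARGING
        report_status("charging")         # status shown, e.g., in the FordPass app
    state = StationState.RETRACTED        # arm retracts back into place
    return state

# Example run with stand-in callbacks:
print(run_charging_cycle(lambda: (0.4, 0.1), print))
```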
Robots Dancing Like Bees
Robot-robot communication and interaction usually take place via networks. Spoken language can also be used. In certain situations, however, these methods reach their limits. For example, during a rescue operation in a disaster area, communication via radio signals might not be possible. With this in mind, Kaustubh Joshi from the University of Maryland and Abhra Roy Chowdhury from the Indian Institute of Science (IISc) have developed an alternative approach. Their paper states: “This research presents a novel bio-inspired framework for two robots interacting together for a cooperative package delivery task with a human-in the-loop. It contributes to eliminating the need for network-based robot-robot interaction in constrained environments. An individual robot is instructed to move in specific shapes with a particular orientation at a certain speed for the other robot to infer using object detection (custom YOLOv4) and depth perception. The shape is identified by calculating the area occupied by the detected polygonal route. A metric for the area’s extent is calculated and empirically used to assign regions for specific shapes and gives an overall accuracy of 93.3% in simulations and 90% in a physical setup. Additionally, gestures are analyzed for their accuracy of intended direction, distance, and the target coordinates in the map. The system gives an average positional RMSE of 0.349 in simulation and 0.461 in a physical experiment.” (Abstract) This form of interaction and communication is reminiscent of the bee dance – and indeed it served as the model. The paper can be accessed via www.frontiersin.org/articles/10.3389/frobt.2022.915884/full.
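A minimal sketch may help to make the core idea concrete: the observing robot reduces the detected polygonal route to the area it encloses and maps that metric to a shape via empirically chosen thresholds. The Python below uses the standard shoelace formula; the waypoints and thresholds are illustrative assumptions, since in the actual system they result from YOLOv4 detections, depth perception, and calibration experiments.

```python
def shoelace_area(points):
    """Area of a simple polygon given as an ordered list of (x, y) waypoints."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def classify_route(points, lower=0.8, upper=1.5):
    """Assign a shape label from the area metric (thresholds are made up)."""
    area = shoelace_area(points)
    if area < lower:
        return "triangle"
    if area < upper:
        return "square"
    return "circle"

# A unit square traced by the signaling robot:
print(classify_route([(0, 0), (1, 0), (1, 1), (0, 1)]))  # -> "square"
```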
Robots and Cops
Robophilosophy 2022 is the fifth event in the biennial Robophilosophy Conference Series. It “will explore the societal significance of social robots for the future of social institutions with its usual broad scope, embracing both theoretical and practical angles” (CfP Robophilosophy). It “is an invitation to philosophers and other SSH researchers, as well as researchers in social robotics and HRI, to investigate from interdisciplinarily informed perspectives whether and how social robotics as an interdisciplinary endeavour can contribute to the ability of our institutions to perform their functions in society” (CfP Robophilosophy). Social institutions include retirement and nursing homes, strip clubs and brothels, monasteries and seminaries, and police departments. As announced by the organizers on April 15, Oliver Bendel (School of Business FHNW) will have the opportunity to present his paper entitled “Robots in Policing” at the conference. It is about how service robots and social robots are changing policing as “social work”. In addition, a poster by Katharina Kühne and Melinda Mende (University of Potsdam) as well as Oliver Bendel entitled “Tamagotchi on our couch: Are social robots perceived as pets?” was accepted.
Will We Love Robots?
“It is estimated that there are now more than 1.7 million robots with social attributes worldwide. They care for, educate, help, and entertain us. There have also long been highly engineered sex robots. But can these machines actually develop feelings – or even feel love?” (Website ARTE, own translation) ARTE asks this question in the series “42 – Die Antwort auf fast alles” (“42 – The Answer to Almost Everything”). The program “Werden wir Roboter lieben?” (“Will we love robots?”) will be broadcast on February 19, 2022. The online version is already available from January 20. Dr. Hooman Samani, a robotics expert at the University of Plymouth, Prof. Dr. Martin Fischer, a cognitive psychologist at the University of Potsdam, and Prof. Dr. Oliver Bendel, an information and machine ethicist at the Hochschule für Wirtschaft FHNW, will have their say. Prof. Dr. Oliver Bendel has been researching conversational agents and social robots for more than 20 years and published the Springer book “Soziale Roboter” (“Social Robots”) at the end of 2021. More information on the program is available via www.arte.tv/de/videos/101938-004-A/42-die-antwort-auf-fast-alles/.
Four-Legged Robots to Scout Factories
Ford is experimenting with four-legged robots that scout factories. The aim is to save time and money. The Ford Media Center presented the procedure on 26 July 2020 as follows: “Ford is tapping four-legged robots at its Van Dyke Transmission Plant in early August to laser scan the plant, helping engineers update the original computer-aided design which is used when we are getting ready to retool our plants. These robots can be deployed into tough-to-reach areas within the plant to scan the area with laser scanners and high-definition cameras, collecting data used to retool plants, saving Ford engineers time and money. Ford is leasing two robots, nicknamed Fluffy and Spot, from Boston Dynamics – a company known for building sophisticated mobile robots.” (Website Ford Media Center) Typically, service robots (e.g., transport robots like Relay) scan buildings to create 2D or 3D models that help them navigate through the rooms. Shuttles use lidar systems to create live 3D models of the environment and to detect obstacles. The robots from Boston Dynamics are also mobile, and that is their great advantage (photo: Ford). Nothing can escape them, nothing can hide from them. The benefit could probably be increased further by including cameras in the building, i.e., by using robot2x communication.
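To make the mapping part concrete: a mobile robot typically converts each lidar sweep into a map representation such as an occupancy grid, in which cells hit by a beam are marked as obstacles. The following Python sketch is a deliberately simplified assumption of how that works – it is not Boston Dynamics’ or Ford’s software.

```python
import math

def scan_to_grid(ranges, angle_step, cell_size=0.1, grid_dim=200):
    """Mark lidar returns in a square occupancy grid centered on the robot.

    ranges: one distance (in meters) per beam; beams are angle_step radians apart.
    """
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    origin = grid_dim // 2                     # robot sits at the grid center
    for i, r in enumerate(ranges):
        theta = i * angle_step
        x = origin + int(r * math.cos(theta) / cell_size)
        y = origin + int(r * math.sin(theta) / cell_size)
        if 0 <= x < grid_dim and 0 <= y < grid_dim:
            grid[y][x] = 1                     # cell hit by this beam = obstacle
    return grid

# A full 360-degree sweep with 360 beams, all returning 2.0 m:
grid = scan_to_grid([2.0] * 360, math.radians(1.0))
print(sum(map(sum, grid)))  # number of occupied cells on the 2 m circle
```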
Dogs Obey Social Robots
Social robots are opening up new research topics in the field of animal-machine interaction. Meiying Qin from Yale University and her co-authors have brought together a Nao and a dog. From the abstract of their paper: “In two experiments, we investigate whether dogs respond to a social robot after the robot called their names, and whether dogs follow the ‘sit’ commands given by the robot. We conducted a between-subjects study (n = 34) to compare dogs’ reactions to a social robot with a loudspeaker. Results indicate that dogs gazed at the robot more often after the robot called their names than after the loudspeaker called their names. Dogs followed the ‘sit’ commands more often given by the robot than given by the loudspeaker. The contribution of this study is that it is the first study to provide preliminary evidence that 1) dogs showed positive behaviors to social robots and that 2) social robots could influence dog’s behaviors. This study enhance the understanding of the nature of the social interactions between humans and social robots from the evolutionary approach. Possible explanations for the observed behavior might point toward dogs perceiving robots as agents, the embodiment of the robot creating pressure for socialized responses, or the multimodal (i.e., verbal and visual) cues provided by the robot being more attractive than our control condition.” (Abstract) You can read the full paper via dl.acm.org/doi/abs/10.1145/3371382.3380734.
Robots against Plastic Waste
WasteShark is a remotely controlled robot by Ranmarine Technologies that collects plastics from the surface of lakes and oceans. “Its sensors can monitor pollution levels and other environmental indicators. It is electrically powered, emission-free and can collect hundreds of kilos of rubbish at a time.” (Euronews, 15 December 2019) According to Euronews, Richard Hardiman, the founder of the start-up company, said: “What we’re trying to do is create a small enough vessel that will get into tight spaces where waste collects, particularly in the harbours and the ports, and stop all that waste being taken out into the greater ocean.” (Euronews, 15 December 2019) The project received support from European funds allocated to making plastics circular. An overview of the most important projects against plastic waste in water can be found here.
Towards Self-replicating Machines
In recent decades, there have been several attempts to supplement traditional electronic storage media. 3D codes with color as the third dimension are an interesting approach. They can be applied to paper or film, for example. Another approach has now been presented by researchers from Switzerland and Israel. They are able to generate artificial DNA and place it in any object. From the abstract: “We devised a ‘DNA-of-things’ (DoT) storage architecture to produce materials with immutable memory. In a DoT framework, DNA molecules record the data, and these molecules are then encapsulated in nanometer silica beads, which are fused into various materials that are used to print or cast objects in any shape. First, we applied DoT to three-dimensionally print a Stanford Bunny that contained a 45 kB digital DNA blueprint for its synthesis. We synthesized five generations of the bunny, each from the memory of the previous generation without additional DNA synthesis or degradation of information. … DoT could be applied to store electronic health records in medical implants, to hide data in everyday objects (steganography) and to manufacture objects containing their own blueprint. It may also facilitate the development of self-replicating machines.” (Abstract) The approach could also be interesting for robots. They could, for example, reproduce themselves on Mars. The article with the title “A DNA-of-things storage architecture to create materials with embedded memory” has been published in Nature Biotechnology and can be accessed via www.nature.com/articles/s41587-019-0356-z.epdf.
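The basic principle behind storing digital data in DNA can be illustrated with a toy example: every two bits of a file map to one of the four nucleotides. The Python sketch below shows only this naive mapping – the actual DoT architecture adds error correction and the encapsulation in silica beads described above, which this example deliberately omits.

```python
# Map each 2-bit chunk to a nucleotide and back (illustrative, not the DoT codec).
BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand: four bases per byte."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # four 2-bit chunks, most significant first
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(strand: str) -> bytes:
    """Reassemble bytes from groups of four bases."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)

# Round trip:
assert decode(encode(b"Stanford Bunny")) == b"Stanford Bunny"
```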
Robots that Learn as They Go
“Alphabet X, the company’s early research and development division, has unveiled the Everyday Robot project, whose aim is to develop a ‘general-purpose learning robot.’ The idea is to equip robots with cameras and complex machine-learning software, letting them observe the world around them and learn from it without needing to be taught every potential situation they may encounter.” (MIT Technology Review, 23 November 2019) This was reported by MIT Technology Review on 23 November 2019 in the article “Alphabet X’s ‘Everyday Robot’ project is making machines that learn as they go”. The approach of Alphabet X seems well thought-out and goal-oriented. In a way, it is oriented towards human learning. One could also teach robots human language in this way. With the help of microphones, cameras, and machine learning, they would gradually understand us better and better. For example, they observe how we point to and comment on a person. Or they perceive that we point to an object and say a certain term – and after some time they conclude that this is the name of the object. However, such frameworks pose ethical and legal challenges. You cannot simply designate entire cities as test areas. The result would be comprehensive surveillance in public spaces. Specially established test areas, on the other hand, would probably not offer the same benefits as “natural environments”. Many questions still need to be answered.
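The word-learning scenario sketched above can be reduced to a simple cross-situational statistic: count how often a heard word co-occurs with a visible object, and over time adopt the most frequent pairing as the word’s meaning. The following Python is a toy illustration of that idea under these assumptions – it is not Alphabet X’s system, and all names are invented for the example.

```python
from collections import defaultdict

class WordObjectLearner:
    """Learn word-object pairings from repeated co-occurrence."""

    def __init__(self):
        # counts[word][object] = number of episodes where both occurred together
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, heard_word, visible_objects):
        """One episode: a word is heard while some objects are in view."""
        for obj in visible_objects:
            self.counts[heard_word][obj] += 1

    def guess(self, word):
        """Current best hypothesis for what the word names, if any."""
        candidates = self.counts[word]
        return max(candidates, key=candidates.get) if candidates else None

# After a few episodes, "cup" is most often seen together with the cup:
learner = WordObjectLearner()
learner.observe("cup", ["cup", "table"])
learner.observe("cup", ["cup", "chair"])
print(learner.guess("cup"))  # -> "cup"
```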