Ford is experimenting with four-legged robots to scout its factories. The aim is to save time and money. The Ford Media Center described the procedure on 26 July 2020 as follows: “Ford is tapping four-legged robots at its Van Dyke Transmission Plant in early August to laser scan the plant, helping engineers update the original computer-aided design which is used when we are getting ready to retool our plants. These robots can be deployed into tough-to-reach areas within the plant to scan the area with laser scanners and high-definition cameras, collecting data used to retool plants, saving Ford engineers time and money. Ford is leasing two robots, nicknamed Fluffy and Spot, from Boston Dynamics – a company known for building sophisticated mobile robots.” (Website Ford Media Center) Typically, service robots (e.g., transport robots like Relay) scan buildings to create 2D or 3D models that help them navigate through the rooms. Shuttles use lidar systems to create live 3D models of the environment in order to detect obstacles. The robots from Boston Dynamics are also mobile, and that is their great advantage (photo: Ford). Nothing can escape them, nothing can hide from them. The benefit could probably be increased further by also including cameras installed in the building, i.e. by using robot2x communication.
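At its core, the laser scanning described above means turning range-and-bearing readings into points in a common coordinate frame, from which a plant model can be rebuilt. The following is a minimal sketch, assuming an idealized 2D lidar and a known robot pose; the actual Boston Dynamics setup fuses 3D scans and camera imagery and is far more involved.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, pose=(0.0, 0.0, 0.0)):
    """Convert lidar range readings (meters) into 2D world coordinates.

    pose = (x, y, heading) of the robot when the scan was taken.
    Readings of inf/NaN (no laser return) are skipped.
    """
    x0, y0, theta = pose
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue
        angle = theta + angle_min + i * angle_increment
        points.append((x0 + r * math.cos(angle), y0 + r * math.sin(angle)))
    return points

# A robot at the origin facing +x, with three readings over a 90-degree sweep:
pts = scan_to_points([1.0, 2.0, 1.5], angle_min=0.0, angle_increment=math.pi / 4)
```

Scans taken from different poses accumulate into one point cloud, which engineers can then compare against the original CAD model.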
The field of animal-machine interaction is gaining new research topics with social robots. Meiying Qin from Yale University and her co-authors have brought together a Nao and a dog. From the abstract of their paper: “In two experiments, we investigate whether dogs respond to a social robot after the robot called their names, and whether dogs follow the ‘sit’ commands given by the robot. We conducted a between-subjects study (n = 34) to compare dogs’ reactions to a social robot with a loudspeaker. Results indicate that dogs gazed at the robot more often after the robot called their names than after the loudspeaker called their names. Dogs followed the ‘sit’ commands more often given by the robot than given by the loudspeaker. The contribution of this study is that it is the first study to provide preliminary evidence that 1) dogs showed positive behaviors to social robots and that 2) social robots could influence dog’s behaviors. This study enhance the understanding of the nature of the social interactions between humans and social robots from the evolutionary approach. Possible explanations for the observed behavior might point toward dogs perceiving robots as agents, the embodiment of the robot creating pressure for socialized responses, or the multimodal (i.e., verbal and visual) cues provided by the robot being more attractive than our control condition.” (Abstract) You can read the full paper via dl.acm.org/doi/abs/10.1145/3371382.3380734.
WasteShark is a remotely controlled robot by Ranmarine Technologies that collects plastics from the surface of lakes and oceans. “Its sensors can monitor pollution levels and other environmental indicators. It is electrically powered, emission-free and can collect hundreds of kilos of rubbish at a time.” (Euronews, 15 December 2019) According to Euronews, Richard Hardiman, the founder of the start-up company, said: “What we’re trying to do is create a small enough vessel that will get into tight spaces where waste collects, particularly in the harbours and the ports, and stop all that waste being taken out into the greater ocean.” (Euronews, 15 December 2019) The project received support from European funds allocated to making plastic circular. An overview of the most important projects against plastic waste in water can be found here.
In recent decades, there have been several attempts to supplement traditional electronic storage media. 3D codes with color as the third dimension are an interesting approach. They can be applied to paper or film, for example. Another approach has now been presented by researchers from Switzerland and Israel. They are able to generate artificial DNA and place it in any object. From the Abstract: “We devised a ‘DNA-of-things’ (DoT) storage architecture to produce materials with immutable memory. In a DoT framework, DNA molecules record the data, and these molecules are then encapsulated in nanometer silica beads, which are fused into various materials that are used to print or cast objects in any shape. First, we applied DoT to three-dimensionally print a Stanford Bunny that contained a 45 kB digital DNA blueprint for its synthesis. We synthesized five generations of the bunny, each from the memory of the previous generation without additional DNA synthesis or degradation of information. … DoT could be applied to store electronic health records in medical implants, to hide data in everyday objects (steganography) and to manufacture objects containing their own blueprint. It may also facilitate the development of self-replicating machines.” (Abstract) The approach could also be interesting for robots. They could, for example, reproduce themselves on Mars. The article with the title “A DNA-of-things storage architecture to create materials with embedded memory” has been published in NATURE BIOTECHNOLOGY and can be accessed via www.nature.com/articles/s41587-019-0356-z.epdf.
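The basic idea of DNA storage can be illustrated with a toy mapping of two bits per nucleotide (00→A, 01→C, 10→G, 11→T). This is a deliberate simplification for illustration only: the actual DoT architecture encodes data with error-correcting fountain codes so that it survives synthesis, encapsulation, and sequencing errors.

```python
BASES = "ACGT"  # two bits per nucleotide: 00->A, 01->C, 10->G, 11->T

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, most significant bits first."""
    return "".join(
        BASES[(b >> shift) & 0b11] for b in data for shift in (6, 4, 2, 0)
    )

def decode(strand: str) -> bytes:
    """Inverse mapping: four nucleotides back to one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        b = 0
        for ch in strand[i:i + 4]:
            b = (b << 2) | BASES.index(ch)
        out.append(b)
    return bytes(out)

strand = encode(b"Hi")  # "CAGACGGC"
```

Even this naive scheme shows the density argument: one byte becomes just four bases of a molecule a few nanometers across.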
“Alphabet X, the company’s early research and development division, has unveiled the Everyday Robot project, whose aim is to develop a ‘general-purpose learning robot.’ The idea is to equip robots with cameras and complex machine-learning software, letting them observe the world around them and learn from it without needing to be taught every potential situation they may encounter.” (MIT Technology Review, 23 November 2019) This was reported by MIT Technology Review on 23 November 2019 in the article “Alphabet X’s ‘Everyday Robot’ project is making machines that learn as they go”. The approach of Alphabet X seems to be well thought-out and target-oriented. In a way, it is oriented towards human learning. One could also teach robots human language in this way. With the help of microphones, cameras and machine learning, they would gradually understand us better and better. For example, they could observe how we point to and comment on a person. Or they could perceive that we point to an object and say a certain term – and after some time conclude that this is the name of the object. However, such frameworks pose ethical and legal challenges. You cannot simply designate cities as test areas; the result would be comprehensive surveillance in public spaces. Specially established test areas, on the other hand, would probably not offer the same benefits as “natural environments”. Many questions still need to be answered.
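The word-learning scenario sketched above amounts to cross-situational statistics: count how often a heard word co-occurs with a visible object, and over time the strongest association wins. A minimal sketch of this idea follows; the class name and interface are invented for illustration, and real systems would of course work on raw audio and images rather than clean symbols.

```python
from collections import Counter, defaultdict

class WordObjectLearner:
    """Cross-situational learner: counts word/object co-occurrences."""

    def __init__(self):
        # word -> Counter of objects seen while the word was heard
        self.counts = defaultdict(Counter)

    def observe(self, words, objects):
        """One situation: a list of heard words and a list of visible objects."""
        for w in words:
            for o in objects:
                self.counts[w][o] += 1

    def best_guess(self, word):
        """Return the object most often seen together with this word."""
        guesses = self.counts.get(word)
        return guesses.most_common(1)[0][0] if guesses else None

learner = WordObjectLearner()
learner.observe(["look", "cup"], ["cup", "table"])
learner.observe(["the", "cup"], ["cup", "ball"])
learner.observe(["a", "ball"], ["ball", "table"])
```

After only two exposures, “cup” is already more strongly associated with the cup than with anything else, even though each single situation was ambiguous.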
In October 2019, Springer VS published the “Handbuch Maschinenethik” (“Handbook Machine Ethics”) with German and English contributions. The editor is Oliver Bendel (Zurich, Switzerland). One of the articles was written by Bertram F. Malle (Brown University, Rhode Island) and Matthias Scheutz (Tufts University, Massachusetts). From the abstract: “We describe a theoretical framework and recent research on one key aspect of robot ethics: the development and implementation of a robot’s moral competence. As autonomous machines take on increasingly social roles in human communities, these machines need to have some level of moral competence to ensure safety, acceptance, and justified trust. We review the extensive and complex elements of human moral competence and ask how analogous competences could be implemented in a robot. We propose that moral competence consists of five elements, two constituents (moral norms and moral vocabulary) and three activities (moral judgment, moral action, and moral communication). A robot’s computational representations of social and moral norms is a prerequisite for all three moral activities. However, merely programming in advance the vast network of human norms is impossible, so new computational learning algorithms are needed that allow robots to acquire and update the context-specific and graded norms relevant to their domain of deployment. Moral vocabulary is needed primarily for moral communication, which expresses moral judgments of others’ violations and explains one’s own moral violations – to justify them, apologize, or declare intentions to do better. Current robots have at best rudimentary moral competence, but with improved learning and reasoning they may begin to show the kinds of capacities that humans will expect of future social robots.” (Abstract “Handbuch Maschinenethik”) The book is available via www.springer.com.
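The “context-specific and graded” norms the abstract calls for could, in the simplest conceivable case, be represented as acceptability values attached to context-action pairs and updated from observed social feedback. The sketch below is a hypothetical illustration of that idea only, not the model proposed by Malle and Scheutz; all names and the update rule are invented.

```python
class NormStore:
    """Toy store of graded, context-specific norms.

    Each (context, action) pair maps to an acceptability value in [0, 1],
    nudged toward observed social feedback by a simple running-average rule.
    """

    def __init__(self, learning_rate=0.2):
        self.lr = learning_rate
        self.norms = {}  # (context, action) -> acceptability in [0, 1]

    def update(self, context, action, feedback):
        """feedback: 1.0 = behavior was approved, 0.0 = it was sanctioned."""
        key = (context, action)
        old = self.norms.get(key, 0.5)  # unknown norms start out neutral
        self.norms[key] = old + self.lr * (feedback - old)

    def acceptability(self, context, action):
        return self.norms.get((context, action), 0.5)

store = NormStore()
for _ in range(10):
    store.update("library", "shouting", 0.0)  # repeatedly sanctioned
```

Because norms are keyed by context, the same action can remain neutral elsewhere – shouting in a park is untouched by what was learned in the library. The hard research problems (where contexts come from, how feedback is perceived) are exactly what this toy version leaves out.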
Hugs are very important to many of us. We are embraced by familiar people and by strangers. When we hug ourselves, it does not have the same effect. And when a robot hugs us, it has no effect at all – or we do not feel comfortable. But this can be changed, at least a little. Alexis E. Block and Katherine J. Kuchenbecker from the Max Planck Institute for Intelligent Systems have published a paper on a research project in this field. The purpose of the project was to evaluate human responses to different robot physical characteristics and hugging behaviors. “Analysis of the results showed that people significantly prefer soft, warm hugs over hard, cold hugs. Furthermore, users prefer hugs that physically squeeze them and release immediately when they are ready for the hug to end. Taking part in the experiment also significantly increased positive user opinions of robots and robot use.” (Abstract) The paper “Softness, Warmth, and Responsiveness Improve Robot Hugs” was published in the International Journal of Social Robotics in January 2019 (First Online: 25 October 2018). It is available via link.springer.com/article/10.1007/s12369-018-0495-2.
The Mayflower Autonomous Ship (MAS) could be the first vessel to cross the Atlantic that is able to navigate around obstacles and hazards by itself. It will depart from Plymouth, UK on 6 September 2020, the 400th anniversary of the original Mayflower voyage, and will reach Plymouth, USA after an exciting tour dedicated to science. “The project was put together by marine research and exploration company ProMare in an effort to expand the scope of marine research. The boat will carry three research pods equipped with scientific instruments to measure various phenomena such as ocean plastics, mammal behaviour or sea level changes.” (ZDNet, 16 October 2019) According to ZDNet, IBM has now joined the initiative to supply technical support for all navigation operations. It is important that the futuristic ship is careful not only with things but also with animals. In this context, insights from animal-machine interaction and machine ethics might be useful. Ultimately, the excursion will also help the marine mammals themselves by obtaining data on their behaviour.
On 20 September 2019, Fridays for Future called for worldwide climate strikes. Hundreds of thousands of people around the world took to the streets to protest for a more sustainable industry and long-term climate policies to fight global warming. Technological progress and the protection of the environment do not have to contradict each other. Quite the opposite: we present three robots that show how technology can be used to achieve climate goals. Planting trees is one of the most effective strategies to restore biodiversity and slow climate change, but it requires a great deal of manual labor. The GrowBot automates this task, achieving a planting rate that is ten times faster than that of trained human planters. In contrast to planting drones, the little truck-like robot does not merely spread seeds; it plants small trees into the soil, which gives them a better chance to survive and fosters reforestation. The bio-inspired Row-bot converts organic matter into operating power, just like the water boatman (an aquatic insect). The robot’s engine is based on a microbial fuel cell (MFC), which enables it to swim. Researchers from the Bristol Robotics Laboratory developed the 3D-printed Row-bot for environmental clean-up operations such as tackling harmful algal blooms and oil spills, or monitoring the impact of natural or man-made environmental catastrophes. Next-level recycling like in the movie WALL·E can be expected with the sorting robot RoCycle, which is being developed at MIT. Unlike classic recycling machines, the robot is capable of distinguishing paper, plastic and metal garbage by using pressure sensors. This tactile solution is 85 percent accurate in stationary use and 63 percent accurate when attached to an assembly line. By adding cameras and magnets, the researchers aim to optimise recycling and help clean up the Earth.
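RoCycle’s tactile sorting can be caricatured as thresholding a stiffness reading obtained by squeezing an item: paper crumples, plastic yields and springs back, metal barely deforms. The function and thresholds below are invented purely for illustration; the real robot relies on calibrated pressure sensing and considerably more sophisticated processing to reach the accuracy figures quoted above.

```python
def classify_material(stiffness: float) -> str:
    """Toy material classifier from a normalized stiffness reading.

    stiffness: 0.0 = fully compliant under the gripper, 1.0 = rigid.
    The thresholds are hypothetical, chosen only to illustrate the idea.
    """
    if stiffness < 0.3:
        return "paper"    # crumples easily under pressure
    if stiffness < 0.7:
        return "plastic"  # deforms, then springs back
    return "metal"        # barely deforms at all

# Classify three squeezed items of increasing rigidity:
labels = [classify_material(s) for s in (0.1, 0.5, 0.9)]
```

Combining such a tactile cue with cameras and magnets, as the MIT researchers plan, would let the system disambiguate the cases where stiffness alone is misleading.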
Robots are repeatedly damaged or destroyed. The hitchBOT is a well-known example. But the security robot K5 has also become a victim of attacks several times. The latest case is described in the magazine Wired: “Every day for 10 months, Knightscope K5 patrolled the parking garage across the street from the city hall in Hayward, California. An autonomous security robot, it rolled around by itself, taking video and reading license plates. Locals had complained the garage was dangerous, but K5 seemed to be doing a good job restoring safety. Until the night of August 3, when a stranger came up to K5, knocked it down, and kicked it repeatedly, inflicting serious damage.” (Wired, 29 August 2019) The author investigates the question of whether one may attack robots. Of course you should not damage other people’s property. But what if the robot is a spy, a data collector, a profile creator? Digital self-defence (which exploits digital as well as analog means) seems to be a proven tool not only in Hong Kong, but also in the US and Europe. The robot rights that some people demand are not a serious obstacle here. Robots do not have rights. They feel nothing, they do not suffer, they have no consciousness. “So punch the robot, I tell you! Test the strength of your sociopolitical convictions on this lunk of inorganic matter!” (Wired, 29 August 2019)