A Fish-inspired Robotic Swarm

A team from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering has developed fish-inspired robots that can synchronize their movements like a real school of fish, without any external control. According to a SEAS press release, it is the first time scientists have demonstrated complex 3D collective behaviors with implicit coordination in underwater robots. “Robots are often deployed in areas that are inaccessible or dangerous to humans, areas where human intervention might not even be possible,” said Florian Berlinger, a PhD candidate at SEAS and Wyss, in an interview. “In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient.” (SEAS, 13 January 2021) The fish-inspired robotic swarm, dubbed Blueswarm, was created in the lab of Prof. Radhika Nagpal, an expert in self-organizing systems. There are several studies and prototypes in the field of robotic fish, from CLEANINGFISH (School of Business FHNW) to an invention by Cornell University in New York.
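The press release does not spell out Blueswarm’s algorithms, but the implicit coordination described above can be illustrated with a classic decentralized model. The Python sketch below is a hypothetical, simplified illustration (Kuramoto-style phase coupling with invented parameter values), not the actual Blueswarm controller: each robot adjusts its own fin-beat phase using only what it observes of its neighbors, and the group synchronizes without any central command.

```python
import math

def sync_step(phases, coupling=0.2):
    """One decentralized update: each robot nudges its own fin-beat
    phase toward the phases it observes on its neighbors
    (Kuramoto-style coupling). No central controller is involved."""
    n = len(phases)
    new_phases = []
    for i, theta in enumerate(phases):
        # each robot uses only locally observable information
        nudge = sum(math.sin(other - theta)
                    for j, other in enumerate(phases) if j != i)
        new_phases.append(theta + coupling * nudge / (n - 1))
    return new_phases

phases = [0.0, 0.5, 1.0, 2.0]  # initial fin-beat phases (radians)
for _ in range(200):
    phases = sync_step(phases)

# the phases contract toward a common value; the spread shrinks toward 0
spread = max(phases) - min(phases)
```

In a model like this, synchrony is an emergent property: no robot knows the global state, yet repeated local adjustments are enough for the whole group to fall into step.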

Young Girls Could Kill Autonomous Driving

On behalf of Prof. Dr. Oliver Bendel, M. Hashem Birahjakli investigated possible attacks on self-driving cars as part of his final thesis in 2020. The supervisor was Safak Korkut. In the chapter “Attacking Scenarios on Sensors” the student distinguished between invasive and non-invasive attacks. In the section on invasive attacks he dealt with different sensors and examined possible attacks based on scenarios: vision-based cameras (chewing gum, lipstick, and nail polish; spraying paint; transparent colored foil; concave lenses), radar (chaff, countermeasure), lidar (mirror and reflective objects; dust; face powder), inertial measurement unit (magnet), and sonar (carrot and stick; duct tape). In the section on non-invasive attacks he dealt with fake traffic signs, invisible or fake obstacles, and roadside attacks. The results of the work suggest that any 14-year-old girl could disable a self-driving car. So far, hacking has been seen as the greatest threat to autonomous driving. But while not everyone can hack, almost everyone carries chewing gum or lipstick. The automotive industry should take this threat seriously.

The MOML Project

In many cases it is important that an autonomous system acts and reacts adequately from a moral point of view. There are some artifacts of machine ethics, e.g., GOODBOT or LADYBIRD by Oliver Bendel or Nao as a care robot by Susan Leigh Anderson and Michael Anderson. But there is no standardization in the field of moral machines yet. The MOML project, initiated by Oliver Bendel, is trying to work in this direction. In the management summary of his bachelor thesis Simon Giller writes: “We present a literature review in the areas of machine ethics and markup languages which shaped the proposed morality markup language (MOML). To overcome the most substantial problem of varying moral concepts, MOML uses the idea of the morality menu. The menu lets humans define moral rules and transfer them to an autonomous system to create a proxy morality. Analysing MOML excerpts allowed us to develop an XML schema which we then tested in a test scenario. The outcome is an XML based morality markup language for autonomous agents. Future projects can use this language or extend it. Using the schema, anyone can write MOML documents and validate them. Finally, we discuss new opportunities, applications and concerns related to the use of MOML. Future work could develop a controlled vocabulary or an ontology defining terms and commands for MOML.” The bachelor thesis will be publicly available in autumn 2020. It was supervised by Dr. Elzbieta Pustulka. There will also be a paper with the results next year.
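Giller’s actual schema is defined in the thesis itself; the element and attribute names below are invented purely to illustrate the general idea of a morality menu whose rules are transferred to an agent as a proxy morality. A minimal Python sketch, using only the standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical MOML fragment -- the element and attribute names are
# invented for illustration and are not taken from the actual schema.
moml_doc = """
<moml version="0.1">
  <agent id="household-robot-01">
    <rule id="r1">
      <condition>animal_detected</condition>
      <action>stop_and_evade</action>
    </rule>
    <rule id="r2">
      <condition>owner_lies</condition>
      <action>point_out_lie</action>
    </rule>
  </agent>
</moml>
"""

root = ET.fromstring(moml_doc)

# Build the agent's "proxy morality": a simple condition -> action table
# that an autonomous system could consult at runtime.
proxy_morality = {
    rule.findtext("condition"): rule.findtext("action")
    for rule in root.iter("rule")
}
```

The appeal of a markup approach is exactly this separation: the human fills in the morality menu once, and the machine mechanically loads the resulting document, without the rules being hard-coded into its software.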

Ingenuity on Mars

The Perseverance rover, which is on its way to Mars, is carrying a drone called Ingenuity (photo/concept: NASA). According to NASA, it is a technology demonstration to test powered flight on another world for the first time. “A series of flight tests will be performed over a 30-Martian-day experimental window that will begin sometime in the spring of 2021. For the very first flight, the helicopter will take off a few feet from the ground, hover in the air for about 20 to 30 seconds, and land. That will be a major milestone: the very first powered flight in the extremely thin atmosphere of Mars! After that, the team will attempt additional experimental flights of incrementally farther distance and greater altitude.” (Website NASA) After the drone has completed its technology demonstration, the rover will continue its scientific mission. Manned and unmanned flights to Mars will bring us several innovations, including novel chatbots and voicebots.

Four-Legged Robots to Scout Factories

Ford is experimenting with four-legged robots to scout factories. The aim is to save time and money. The Ford Media Center presented the procedure on 26 July 2020 as follows: “Ford is tapping four-legged robots at its Van Dyke Transmission Plant in early August to laser scan the plant, helping engineers update the original computer-aided design which is used when we are getting ready to retool our plants. These robots can be deployed into tough-to-reach areas within the plant to scan the area with laser scanners and high-definition cameras, collecting data used to retool plants, saving Ford engineers time and money. Ford is leasing two robots, nicknamed Fluffy and Spot, from Boston Dynamics – a company known for building sophisticated mobile robots.” (Website Ford Media Center) Typically, service robots (e.g., transport robots like Relay) scan buildings to create 2D or 3D models that help them navigate through the rooms. Shuttles use lidar systems to create live 3D models of the environment in order to detect obstacles. The robots from Boston Dynamics are also mobile, and that is their great advantage (photo: Ford). Nothing can escape them, nothing can hide from them. The benefit could probably be increased by including cameras in the building, i.e., by using robot2x communication.
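The article does not detail Ford’s or Boston Dynamics’ processing pipeline, but the basic step behind such laser scanning – turning range measurements into map points – can be sketched. The Python function below is a generic, simplified 2D illustration with an assumed beam geometry, not the robots’ actual software:

```python
import math

def scan_to_points(pose, ranges, angle_min=-math.pi / 2,
                   angle_step=math.pi / 180):
    """Convert one 2D lidar scan into obstacle points in world coordinates.

    pose   = (x, y, heading) of the robot in the world frame
    ranges = measured distance per beam; None means no return.

    Accumulating such points over many poses yields the kind of 2D map
    a service robot navigates with (a 3D scanner works analogously with
    an extra elevation angle per beam)."""
    x, y, heading = pose
    points = []
    for i, r in enumerate(ranges):
        if r is None:
            continue  # beam hit nothing within range
        angle = heading + angle_min + i * angle_step
        points.append((x + r * math.cos(angle), y + r * math.sin(angle)))
    return points
```

For example, a single 1-meter return on the first beam of a robot at the origin maps to a point roughly one meter to the robot’s right; repeating this for thousands of beams per second, from many poses, is what produces the dense point clouds engineers use to update the plant’s CAD model.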

Show Me Your Hands

Fujitsu has developed an artificial intelligence system that could ensure healthcare, hotel and food industry workers scrub their hands properly. This could support the fight against the COVID-19 pandemic. “The AI, which can recognize complex hand movements and can even detect when people aren’t using soap, was under development before the coronavirus outbreak for Japanese companies implementing stricter hygiene regulations … It is based on crime surveillance technology that can detect suspicious body movements.” (Reuters, 19 June 2020) Genta Suzuki, a senior researcher at the Japanese information technology company, told the news agency that the AI can’t identify people from their hands, but it could be coupled with identity recognition technology so companies could keep track of employees’ washing habits. Maybe in the future it won’t be our parents who will show us how to wash ourselves properly, but robots and AI systems. Or they save themselves this detour and clean us directly.

Robot Performs COVID-19 Tests

The COVID-19 pandemic has given a boost to service robotics. Transport, safety and care robots are in demand, as are cleaning and disinfection robots. Service robots measure the temperature of passengers at airports and railway stations. Now they can also perform COVID-19 tests. “Robotics researchers from the University of Southern Denmark have developed the world’s first fully automatic robot capable of carrying out throat swabs for Covid-19, so that healthcare professionals are not exposed to the risk of infection. The prototype has successfully performed throat swabs on several people. The scientists behind are cheering: The technology works!” (Website SDU, 27 May 2020) A robot arm of the kind known from industry was used. The end piece comes from a 3D printer. This is another example from the health sector that shows how industrial robots – such as cobots – can become service robots. More information via www.sdu.dk/en/nyheder/forskningsnyheder/robot-kan-pode-patienter-for-covid-19.

The BESTBOT on Mars

Living, working, and sleeping in small spaces next to the same people for months or years would be stressful for even the fittest and toughest astronauts. Neel V. Patel underlines this fact in a recent article for MIT Technology Review. If they are near Earth, they can talk to psychologists. But if they are far away, it will be difficult. Moreover, in the future there could be astronauts in space whose clients cannot afford human psychological support. “An AI assistant that’s able to intuit human emotion and respond with empathy could be exactly what’s needed, particularly on future missions to Mars and beyond. The idea is that it could anticipate the needs of the crew and intervene if their mental health seems at risk.” (MIT Technology Review, 14 January 2020) NASA wants to develop such an assistant together with the Australian tech firm Akin. They could build on research by Oliver Bendel. Together with his teams, he developed the GOODBOT in 2013 and the BESTBOT in 2018. Both can detect users’ problems and react adequately to them. The more recent chatbot even has face recognition in combination with emotion recognition. If it detects discrepancies with what the user has said or written, it will make this a subject of discussion. The BESTBOT on Mars – it would like that.

HTML, SSML, AIML – and MOML?

On behalf of Prof. Dr. Oliver Bendel, a student at the School of Business FHNW, Alessandro Spadola, investigated in the context of machine ethics whether markup languages such as HTML, SSML and AIML can be used to transfer moral aspects to machines or websites, and whether there is room for a new language that could be called Morality Markup Language (MOML). He presented his results in January 2020. From the management summary: “However, the idea that owners should be able to transmit their own personal morality has been explored by Bendel, who has proposed an open way of transferring morality to machines using a markup language. This research paper analyses whether a new markup language could be used to imbue machines with their owners’ sense of morality. This work begins with an analysis how a markup language is structured, describes the current well-known markup languages and analyses their differences. In doing so, it reveals that the main difference between the well-known markup languages lies in the different goals they pursue which at the same time forms the subject, which is marked up. This thesis then examines the possibility of transferring personal morality with the current languages available and discusses whether there is a need for a further language for this purpose. As is shown, morality can only be transmitted with increased effort and the knowledge of human perception because it is only possible to transmit them by interacting with the senses of the people. The answer to the question of whether there is room for another markup language is ‘yes’, since none of the languages analysed offer a simple way to transmit morality, and simplicity is a key factor in markup languages. Markup languages all have clear goals, but none have the goal of transferring and displaying morality. The language that could assume this task is ‘Morality Markup’, and the present work describes how such a language might look.” (Management Summary) The promising results will be taken up later this year by another student in a bachelor thesis.

The Future of Autonomous Driving

Driving in cities is a very complex matter. There are several reasons for this: You have to judge hundreds of objects and events at all times. You have to communicate with people. And you should be able to change decisions spontaneously, for example because you remember that you have to buy something. That’s a bad prospect for an autonomous car. Of course it can do some tricks: It can drive very slowly. It can use virtual tracks or special lanes, signals, and sounds. A bus or shuttle can use such tricks; a car hardly can. Autonomous individual transport in cities will only be possible if the cities are redesigned. This was done a few decades ago, and it wasn’t a good idea at all. So don’t let autonomous cars drive in the cities; let them drive on the highways. Should autonomous cars make moral decisions about the lives and deaths of pedestrians and cyclists? They had better not. Moral machines are a valuable innovation in certain contexts, but not in city traffic. Pedestrians and cyclists rarely get onto the highway. There are many reasons why we should allow autonomous cars only there.