Some universities strive to use holograms in their teaching. Through this technology, a representation of the lecturer would have a presence in physical space. Even interactions and conversations would be possible if the holograms or projections were connected to speech systems. Dr. David Lefevre, director of Imperial’s Edtech Lab, told the BBC one year ago: “The alternative is to use video-conferencing software but we believe these holograms have a much greater sense of presence”. American Samoa Community College (ASCC) has now switched on a digital platform that streams 3D holograms of University of Hawaiʻi faculty members, allowing them to deliver classes and engage with ASCC students in real time. According to the university’s website, students at the HoloCampus launch on August 20 received a lecture by UH Mānoa Water Resources Research Center researcher Chris Shuler on the subject of “sustainability and resilience” – a theme “with special significance for the people of American Samoa and Pacific Islands nations as they face challenges such as increasing plastic waste and more dramatic weather systems brought about by climate change” (Website University of Hawaiʻi). Holograms could play a role in all sorts of areas, including social and sexual relationships.
Security technologies are becoming increasingly widespread. Some of them, such as the security robot K5, create and undermine security at the same time. Mass shootings such as those in Dayton and El Paso are a particular problem. New tech firms like Athena are offering solutions, as Fast Company reports: “Athena Security uses object-motion detection to spot when an individual brandishes a firearm, and immediately sends an alert to their client, whether that’s a private security firm or local law enforcement. The company’s AI object-motion detection is camera agnostic, meaning it can work on any CCTV system. When a gun is detected, the video feed of the active shooter is made available to the client both on mobile devices and desktop computers, allowing officers to know what they are dealing with and where it is happening, all in the space of three seconds …” (Fast Company, 23 August 2019) In fact, technologies are often the only means against technologies, and they may well help to prevent mass shootings. Another option would be to disarm the population – but this would meet with resistance in the USA. The problem remains that this is surveillance technology. As with the K5 and other service robots, the same principle applies: one form of security is gained, another form of security is lost.
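The workflow Fast Company describes – detect a brandished firearm in any camera feed, then push an alert with the camera location to the client – can be sketched roughly as follows. All names, data structures and the confidence threshold here are hypothetical illustrations, not Athena’s actual implementation:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    label: str          # e.g. "person", "firearm"
    confidence: float   # 0.0 .. 1.0
    camera_id: str      # the logic never sees the video source itself,
                        # which is what "camera agnostic" amounts to

@dataclass
class Alert:
    camera_id: str
    confidence: float
    message: str

def check_frame(detections: List[Detection],
                threshold: float = 0.8) -> Optional[Alert]:
    """Return an Alert if any detection in the frame looks like a firearm."""
    firearms = [d for d in detections
                if d.label == "firearm" and d.confidence >= threshold]
    if not firearms:
        return None
    best = max(firearms, key=lambda d: d.confidence)
    return Alert(
        camera_id=best.camera_id,
        confidence=best.confidence,
        message=(f"Possible firearm on camera {best.camera_id} "
                 f"(confidence {best.confidence:.0%}) -- forwarding live feed"),
    )

# Hypothetical detections from one frame of an arbitrary CCTV camera.
frame = [
    Detection("person", 0.97, "lobby-cam-3"),
    Detection("firearm", 0.91, "lobby-cam-3"),
]
alert = check_frame(frame)  # an Alert for camera "lobby-cam-3"
```

In a real deployment the alert would of course trigger a notification to a security firm or law enforcement rather than just being returned; the sketch only shows the decision step that sits between the detector and that notification.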
Chimeras in the biological and medical sense are organisms that consist of cells or tissues of different individuals and yet form closed, viable (though not necessarily reproductive) organisms. They can occur within a species or between species, and in both plants and animals. There are natural mixed organisms (such as blood chimeras in mammals) and artificial ones (grafted plants, animal-human embryos). Cyborgs are not chimeras in this sense. Nevertheless, research in this field might also be relevant for them, in particular for inverted or reversed cyborgs, for example robots in which an animal or human brain or organ is implanted. Animal-human chimeras for the production of human organs are regarded as unproblematic by many ethicists. According to a comment by Oliver Bendel, this is astonishing, since findings from animal ethics and veterinary medicine – in particular the suffering and death of non-human living beings – are ignored.
In recent years, there has been widespread media coverage of the arrival of sex robots (e.g., Harmony and Henry, produced by Realbotix™, 2019). This has created heated discussions around the pros and cons of introducing sex robots into human relationships. It has also served to draw attention to the far-reaching social and ethical challenges that this new technology will impose on us as its users. A recent Nature editorial (entitled “AI LOVE YOU”, Nature 547, 138, July 2017) called for urgent empirical research so that empirical evidence can be used to inform robot design and guide public ethical debates. In response to this urgent need, Yuefang Zhou co-organized the first international workshop (AI Love You, 2017) on the theme of human-robot intimate relationships. The workshop brought together an interdisciplinary team – psychologists, philosophers, computer scientists, ethicists, clinicians, as well as interested members of the general public – to discuss this emerging topic. The newly released book (“AI Love You: Developments in Human-Robot Intimate Relationships”, 2019, www.springer.com/gp/book/9783030197339) builds on the presentations and discussions at the workshop to answer the question of readiness from the perspectives of both the technology and the humans who use it.
Amazon Rekognition is a well-known software for facial recognition, including emotion detection. It is used in the BESTBOT, a moral machine that hides an immoral machine. The immoral element stems precisely from facial recognition, which endangers the privacy of the user and his or her informational autonomy. Not least, the project is intended to draw attention to this risk. On 12 August 2019, Amazon announced that it had improved and expanded its system: “Today, we are launching accuracy and functionality improvements to our face analysis features. Face analysis generates metadata about detected faces in the form of gender, age range, emotions, attributes such as ‘Smile’, face pose, face image quality and face landmarks. With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: ‘Happy’, ‘Sad’, ‘Angry’, ‘Surprised’, ‘Disgusted’, ‘Calm’ and ‘Confused’) and added a new emotion: ‘Fear’.” (Amazon, 12 August 2019) Because the BESTBOT also accesses other systems such as the MS Face API and Kairos, it could already recognize fear. So the change at Amazon means no change for this artifact of machine ethics.
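The emotion metadata Amazon describes is returned per detected face as a list of labeled confidences. The following sketch shows how a client such as the BESTBOT might pick out the dominant emotion from a response shaped like Rekognition’s DetectFaces output; the sample response is fabricated, and a real call would go through the AWS SDK (e.g., `boto3.client("rekognition").detect_faces(...)` with `Attributes=["ALL"]`):

```python
def dominant_emotions(response: dict) -> list:
    """Return the highest-confidence emotion label for each detected face
    in a DetectFaces-style response."""
    result = []
    for face in response.get("FaceDetails", []):
        emotions = face.get("Emotions", [])
        if emotions:
            top = max(emotions, key=lambda e: e["Confidence"])
            result.append(top["Type"])
    return result

# Fabricated example response: one face, with the newly added
# "FEAR" emotion scoring highest.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "FEAR", "Confidence": 87.2},
                {"Type": "SURPRISED", "Confidence": 9.1},
                {"Type": "CALM", "Confidence": 1.4},
            ]
        }
    ]
}

print(dominant_emotions(sample_response))  # → ['FEAR']
```

From a machine-ethics perspective, the interesting point is how little client code changes when a provider adds an emotion: the parsing logic above is indifferent to which labels appear, which is also why the BESTBOT’s behavior is unaffected by Amazon’s update.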
The papers of the CHI 2019 workshop “Conversational Agents: Acting on the Wave of Research and Development” (Glasgow, 5 May 2019) are now listed on convagents.org. The extended abstract by Oliver Bendel (School of Business FHNW), entitled “Chatbots as Moral and Immoral Machines”, can be downloaded there. The workshop brought together experts from all over the world who work on the foundations of chatbots and voicebots and implement them in different ways. Companies such as Microsoft, Mozilla and Salesforce were also present. Approximately 40 extended abstracts were submitted. On 6 May, a bagpipe player opened the four-day conference that followed the 35 workshops. Dr. Aleks Krotoski, Pillowfort Productions, gave the first keynote. One of the paper sessions in the morning was dedicated to the topic “Values and Design”. All in all, both classical fields of applied ethics and the young discipline of machine ethics were represented at the conference. More information via chi2019.acm.org.