On 24 October 2020 the article „Love Dolls and Sex Robots in Unproven and Unexplored Fields of Application“ by Oliver Bendel was published in Paladyn, Journal of Behavioral Robotics. From the Abstract: „Love dolls, the successors of blow-up dolls, are widespread. They can be ordered online or bought in sex shops and can be found in brothels and households. Sex robots are also on the rise. Research, however, has been slow to address this topic thoroughly. Often, it does not differentiate between users and areas of application, remaining vague, especially in the humanities and social sciences. The present contribution deals with the idea and history of love dolls and sex robots. Against this background, it identifies areas of application that have not been investigated or have hardly been investigated at all. These include prisons, the military, monasteries and seminaries, science, art and design as well as the gamer scene. There is, at least, some relevant research about the application of these artefacts in nursing and retirement homes and as such, these will be given priority. The use of love dolls and sex robots in all these fields is outlined, special features are discussed, and initial ethical, legal and pragmatic considerations are made. It becomes clear that artificial love servants can create added value, but that their use must be carefully considered and prepared. In some cases, their use may even be counterproductive.“ The article is available here for free as an open access publication.
Social robots and service robots usually have a defined locomotor system, a defined appearance and defined facial and gestural capabilities. On the one hand, this leads to a certain familiarization effect. On the other hand, it limits the actions of the robots, for example in the household or in a shopping mall. Robot enhancement is used to extend and improve social robots and service robots. It changes their appearance and expands their scope. It is possible to apply attachments to the hardware, extend limbs and exchange components. One can pull a skin made of silicone over the face or head, making the robots look humanoid. One can also change the software and connect the robot to AI systems – this is already being done in many cases. The project or thesis, announced by Oliver Bendel in August 2020 at the School of Business FHNW, should first present the principles and functional possibilities of robot enhancement. Second, concrete examples should be given and described. One of these examples, e.g. the silicone skin, is to be implemented. Robots like Pepper or Atlas would be completely transformed by such a skin. They could look uncanny, but also appealing. The project will start in September 2020.
The paper „Co-Robots as Care Robots“ by Oliver Bendel, Alina Gasser and Joel Siebenmann was accepted at the AAAI 2020 Spring Symposia. From the abstract: „Cooperation and collaboration robots, co-robots or cobots for short, are an integral part of factories. For example, they work closely with the fitters in the automotive sector, and everyone does what they do best. However, the novel robots are not only relevant in production and logistics, but also in the service sector, especially where proximity between them and the users is desired or unavoidable. For decades, individual solutions of a very different kind have been developed in care. Now experts are increasingly relying on co-robots and teaching them the special tasks that are involved in care or therapy. This article presents the advantages, but also the disadvantages of co-robots in care and support, and provides information with regard to human-robot interaction and communication. The article is based on a model that has already been tested in various nursing and retirement homes, namely Lio from F&P Robotics, and uses results from accompanying studies. The authors can show that co-robots are ideal for care and support in many ways. Of course, it is also important to consider a few points in order to guarantee functionality and acceptance.“ The paper had been submitted to the symposium „Applied AI in Healthcare: Safety, Community, and the Environment“. Oliver Bendel will present the results at Stanford University between 23 and 25 March 2020.
Fig.: With or without a robot, the teddy bear must not be missing
The paper „Care Robots with Sexual Assistance Functions“ by Oliver Bendel was accepted at the AAAI 2020 Spring Symposia. From the abstract: „Residents in retirement and nursing homes have sexual needs just like other people. However, the semi-public situation makes it difficult for them to satisfy these existential concerns. In addition, they may not be able to meet a suitable partner or find it difficult to have a relationship for mental or physical reasons. People who live or are cared for at home can also be affected by this problem. Perhaps they can host someone more easily and discreetly than the residents of a health facility, but some elderly and disabled people may be restricted in some ways. This article examines the opportunities and risks that arise with regard to care robots with sexual assistance functions. First of all, it deals with sexual well-being. Then it presents robotic systems ranging from sex robots to care robots. Finally, the focus is on care robots, with the author exploring technical and design issues. A brief ethical discussion completes the article. The result is that care robots with sexual assistance functions could be an enrichment of the everyday life of people in need of care, but that we also have to consider some technical, design and moral aspects.“ The paper had been submitted to the symposium „Applied AI in Healthcare: Safety, Community, and the Environment“. Oliver Bendel will present the paper at Stanford University between 23 and 25 March 2020.
Fig.: Should care robots have sexual assistance functions?
Automation is advancing relentlessly. Digitization became its partner decades ago. In industry, innovative robots such as co-robots are used. Service robots are beginning to spread in various areas. Artificial intelligence systems perform tasks of all kinds, even creative activities. Studies on the development of the labor market reach different results. In any case, it can be said that certain jobs will disappear and many people will have to do without their familiar work. It can also be assumed that in many areas less human work will be required (e.g., for customers and employers). As possible solutions to the resulting economic and social problems, an unconditional basic income and a robot tax have been suggested. The paper „Are Robot Tax, Basic Income or Basic Property Solutions to the Social Problems of Automation?“ by Oliver Bendel presents, discusses and criticizes these approaches in the context of automation and digitization. Moreover, it develops a relatively unknown proposal, unconditional basic property, and presents its potentials as well as its risks. The lecture took place on 26 March 2019 at Stanford University (AAAI Spring Symposium „Interpretable AI for Well-Being: Understanding Cognitive Bias and Social Embeddedness“) and led to lively discussions. It was nominated for „best presentation“. The paper has now been published as a preprint and can be downloaded here.
„The British elite university Oxford has received a donation of 150 million pounds (around 168 million euros) from US billionaire Stephen A. Schwarzman. With the largest single donation in the university’s history, the ‚Stephen A. Schwarzman Centre‘ for the humanities is to be built.“ (SPON, 19 June 2019) This was reported by Spiegel on 19 June 2019. It goes on: „The building will house, among other things, the faculties of … history and linguistics, philosophy, music and theology. Around a quarter of all Oxford students are enrolled in these subjects. In addition, a new institute for ethics in dealing with artificial intelligence is to be established there, the university announced.“ (SPON, 19 June 2019) The focus appears to be on information ethics and robot ethics. According to Spiegel, Schwarzman himself said that universities must help develop ethical principles for rapid technological change. The origin of the funds is a subject of debate. More information via www.spiegel.de/lebenundlernen/uni/oxford-elite-uni-erhaelt-150-millionen-pfund-spende-a-1273161.html.
Dr. Mathilde Noual (Freie Universität Berlin) and Prof. Dr. Oliver Bendel (School of Business FHNW) are organizing a workshop on the social implications of artificial intelligence (AI) and robotics. They are looking for constructive proposals of technological and conceptual utopias, of counter-cultures and counter-systems offering strategies for preserving privacy, individuality, and freedom in a technological world, for going beyond AI’s present limitations and frustrations, and for emphasising the beauty of the world and of humans’ way of accessing it (with a high degree of nuance, contextuality, subjectivity, adaptability and actuality). The tracks are: The territory today: core limitations and prospects of AI; technologies and approaches against surveillance technologies (examples: the virtual burka; the hacked social credit system); technologies and approaches for an intact environment (examples: AI and robots for clean waters and seas; animals with weapons for self-defence); technologies and approaches for a new policy (example: AI as a president); technologies and approaches for shared knowledge and education (example: open research solutions). The workshop will take place on the 29th and 30th of June 2019 in Berlin, at the Weizenbaum Institute. The CfP is addressed exclusively to the invited persons.
At the end of 2018, the article entitled „Learning How to Behave: Moral Competence for Social Robots“ by Bertram F. Malle and Matthias Scheutz was published in the „Handbuch Maschinenethik“ („Handbook Machine Ethics“) (ed.: Oliver Bendel). An excerpt from the abstract: „We describe a theoretical framework and recent research on one key aspect of robot ethics: the development and implementation of a robot’s moral competence.“ The authors propose „that moral competence consists of five elements, two constituents (moral norms and moral vocabulary) and three activities (moral judgment, moral action, and moral communication)“. „A robot’s computational representations of social and moral norms is a prerequisite for all three moral activities. However, merely programming in advance the vast network of human norms is impossible, so new computational learning algorithms are needed that allow robots to acquire and update the context-specific and graded norms relevant to their domain of deployment. Moral vocabulary is needed primarily for moral communication, which expresses moral judgments of others’ violations and explains one’s own moral violations – to justify them, apologize, or declare intentions to do better. Current robots have at best rudimentary moral competence, but with improved learning and reasoning they may begin to show the kinds of capacities that humans will expect of future social robots.“ (Abstract) An overview of the contributions that have been published electronically since 2017 can be found on link.springer.com/referencework/10.1007/978-3-658-17484-2.
Robots in the health sector are important, valuable innovations and supplements. As therapy and nursing robots, they take care of us and come close to us. In addition, other service robots are widespread in nursing and retirement homes and hospitals. With the help of their sensors, all of them are able to recognize us, to examine and classify us, and to evaluate our behavior and appearance. Some of these robots will pass on our personal data to humans and machines. They invade our privacy and challenge our informational autonomy. This is a problem for institutions and people that needs to be solved. The article „The Spy who Loved and Nursed Me: Robots and AI Systems in Healthcare from the Perspective of Information Ethics“ by Oliver Bendel presents robot types in the health sector, along with their technical possibilities, including their sensors and their artificial intelligence capabilities. Against this background, moral problems are discussed, especially from the perspective of information ethics and with respect to privacy and informational autonomy. One of the results shows that such robots can improve personal autonomy, but endanger informational autonomy in an area where privacy is of special importance. At the end, solutions are proposed from various disciplines and perspectives. The article was published in Telepolis on December 17, 2018, and can be accessed via www.heise.de/tp/features/The-Spy-who-Loved-and-Nursed-Me-4251919.html.
„Within a few decades, autonomous and semi-autonomous machines will be found throughout Earth’s environments, from homes and gardens to parks and farms and so-called working landscapes – everywhere, really, that humans are found, and perhaps even places we’re not. And while much attention is given to how those machines will interact with people, far less is paid to their impacts on animals.“ (Anthropocene, October 10, 2018) „Machines can disturb, frighten, injure, and kill animals,“ says Oliver Bendel, an information systems professor at the University of Applied Sciences and Arts Northwestern Switzerland, according to the magazine. „Animal-friendly machines are needed.“ (Anthropocene, October 10, 2018) In the article „Will smart machines be kind to animals?“ the magazine Anthropocene deals with animal-friendly machines and introduces the work of the scientist. It is based on his paper „Towards animal-friendly machines“ (Paladyn) and an interview conducted by journalist Brandon Keim with Oliver Bendel. More via www.anthropocenemagazine.org/2018/10/animal-friendly-ai/.
The Gatebox was given to some people and institutions in Japan some time ago. At the end of July 2018, the company announced that it is now going into series production. In fact, the machine, which resembles a coffee machine, can be ordered on the website. The anime girl Azuma Hikari lives in a glass „coffee pot“. She is a hologram connected to a dialogue system and an AI system. She communicates with her owner even when he is out and about (by sending messages to his smartphone), and she is capable of learning. SRF visited a young man who lives with the Gatebox. „I love my wife,“ Kondo Akihiko is quoted as saying. The station writes: „He can’t hug or kiss her. The Japanese guy is with a hologram.“ (SRF) Anyone who thinks that the love for manga and anime girls is a purely Japanese phenomenon is mistaken. In Dortmund’s BorDoll (from „Bordell“ and „Doll“ or „Love Doll“) the corresponding love dolls are in high demand. Here, too, it is young men who are shy of real women and have developed a desire in the tradition of Pygmalion. Akihiko Kondo dreams that one day he will be able to go out into the world with Azuma Hikari and hold her hand. But there is a long way to go, and the anime girl will still need her little prison for a long time.
The young discipline of machine ethics refers to the morality of semi-autonomous and autonomous machines, robots, bots or software systems. They become special moral agents, and depending on their behavior, we can call them moral or immoral machines. They decide and act in situations where they are left to their own devices, either by following pre-defined rules, by comparing their current situation to case models, or as machines capable of learning and deriving rules. Moral machines have been known for some years, at least as simulations and prototypes. Machine ethics works closely with artificial intelligence and robotics. The term machine morality can be used in a similar way to the term artificial intelligence. Oliver Bendel has developed a graphic that illustrates the relationship between machine ethics and artificial intelligence. He presented it at conferences at Stanford University (AAAI Spring Symposia), in Fort Lauderdale (ISAIM) and in Vienna (Robophilosophy) in 2018.
Fig.: The terms of machine ethics and artificial intelligence
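The rule-following variant described above can be sketched in a few lines of code. The following is a minimal illustration only; the rules, situation fields and action names are hypothetical and are not taken from any of the systems or projects mentioned in these posts:

```python
# Minimal sketch of a rule-based moral agent: the machine checks
# pre-defined moral rules against its current situation and picks
# the action of the first rule that matches. All rules and action
# names below are hypothetical illustrations.

def decide(situation, rules):
    """Return the action of the first matching rule, else a default."""
    for condition, action in rules:
        if condition(situation):
            return action
    return "proceed"  # no moral rule applies


# Hypothetical rules for an animal-friendly household robot
rules = [
    (lambda s: s.get("animal_detected"), "stop_and_wait"),
    (lambda s: s.get("fragile_object"), "avoid"),
]

print(decide({"animal_detected": True}, rules))  # stop_and_wait
print(decide({}, rules))                         # proceed
```

Case-based and learning machines differ in that the mapping from situation to action is not enumerated in advance but retrieved from stored cases or derived from training data.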