Animals and Robots

The manuscript of the book "Non-Human Animals, Ethics and Engineering" (alternative title: "Animals, Ethics and Engineering") was sent to the publisher Jenny Stanford in May 2024. It contains 16 chapters on this topic, including contributions by Clara Mancini ("Animal-Centered Technology and Sustainable Development"), Fiona French ("Designing and Crafting Systems for Non-Human Animals"), and Leonie Bossert together with Thilo Hagendorff ("Animals and AI: The Role of Animals in AI Research and Application"). In "An Investigation into the Encounter Between Social Robots and Animals" (Chapter 12), Oliver Bendel "delves into the evolving landscape of social robots designed to interact with animals, dissecting the intricate dynamics of these interactions and their ethical ramifications" (information provided by the editors). The philosopher of technology also presents his own projects, such as concepts and prototypes of animal-friendly machines developed in the context of machine ethics, animal-machine interaction, and social robotics. The editors are Rosalyn W. Berne and Madeline A. Kibler from the University of Virginia. The book is scheduled for publication in late summer or fall 2024.

Fig.: A robot turtle and a woman (Image: Ideogram)

Wildlife Cameras, Drones, and Robots

ACI, the world's leading conference on animal-computer interaction, took place from 5 to 8 December 2022 in Newcastle upon Tyne. The proceedings were published in the ACM Digital Library on March 30, 2023. They include the paper "A Face Recognition System for Bears: Protection for Animals and Humans in the Alps" by Oliver Bendel and Ali Yürekkirmaz. From the abstract: "Face recognition, in the sense of identifying people, is controversial from a legal, social, and ethical perspective. In particular, opposition has been expressed to its use in public spaces for mass surveillance purposes. Face recognition in animals, by contrast, seems to be uncontroversial from a social and ethical point of view and could even have potential for animal welfare and protection. This paper explores how face recognition for bears (understood here as brown bears) in the Alps could be implemented within a system that would help animals as well as humans. It sets out the advantages and disadvantages of wildlife cameras, ground robots, and camera drones that would be linked to artificial intelligence. Based on this, the authors make a proposal for deployment. They favour a three-stage plan that first deploys fixed cameras and then incorporates camera drones and ground robots. These are all connected to a control centre that assesses images and developments and intervenes as needed. The paper then discusses social and ethical, technical and scientific, and economic and structural perspectives. In conclusion, it considers what could happen in the future in this context." The proceedings can be accessed via dl.acm.org/doi/proceedings/10.1145/3565995.
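
A minimal sketch of what the identification step of such a system might look like, assuming face embeddings delivered by an upstream detection model; the cosine-similarity matching, the threshold, and the gallery of known individuals are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: the paper proposes the overall system, not this code.
# The embedding model, threshold, and control-centre hook are assumptions.
from dataclasses import dataclass
import numpy as np

@dataclass
class Sighting:
    camera_id: str          # fixed wildlife camera, camera drone, or ground robot
    embedding: np.ndarray   # face embedding produced by an upstream model

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_bear(sighting: Sighting,
                  gallery: dict[str, np.ndarray],
                  threshold: float = 0.8) -> str:
    """Match a sighting against known individuals; return an ID or 'unknown'."""
    best_id, best_score = "unknown", threshold
    for bear_id, reference in gallery.items():
        score = cosine_similarity(sighting.embedding, reference)
        if score > best_score:
            best_id, best_score = bear_id, score
    return best_id

def report(sighting: Sighting, gallery: dict[str, np.ndarray]) -> None:
    """Forward the result to the (hypothetical) control centre for assessment."""
    print(f"[{sighting.camera_id}] bear identified as: {identify_bear(sighting, gallery)}")

# Example with random vectors standing in for real embeddings
rng = np.random.default_rng(0)
gallery = {"bear_01": rng.normal(size=128), "bear_02": rng.normal(size=128)}
report(Sighting("cam-alp-03", gallery["bear_01"] + 0.05 * rng.normal(size=128)), gallery)

In a deployment along the lines of the proposed three-stage plan, the same matching routine could be fed first by fixed cameras and later by camera drones and ground robots, with the control centre assessing the results and intervening as needed.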

Fig.: A brown bear

When Robots Hug People

From March 27 to 29, 2023, the AAAI 2023 Spring Symposia will feature the symposium "Socially Responsible AI for Well-being" by Takashi Kido and Keiki Takadama. The venue is usually Stanford University; for staffing reasons, the conference will be held this year at the Hyatt Regency in San Francisco. On March 28, Prof. Dr. Oliver Bendel will present the paper "Increasing Well-being through Robotic Hugs", which he wrote together with Andrea Puljic, Robin Heiz, Furkan Tömen, and Ivan De Paola. From the abstract: "This paper addresses the question of how to increase the acceptability of a robot hug and whether such a hug contributes to well-being. It combines the lead author’s own research with pioneering research by Alexis E. Block and Katherine J. Kuchenbecker. First, the basics of this area are laid out with particular attention to the work of the two scientists. The authors then present HUGGIE Project I, which largely consisted of an online survey with nearly 300 participants, followed by HUGGIE Project II, which involved building a hugging robot and testing it on 136 people. At the end, the results are linked to current research by Block and Kuchenbecker, who have equipped their hugging robot with artificial intelligence to better respond to the needs of subjects." More information via aaai.org/conference/spring-symposia/sss23/.

Fig.: The HUGGIE team (without Oliver Bendel)

Robots and Pets

Katharina Kühne and Melinda A. Jeglinski-Mende (University of Potsdam), together with Oliver Bendel (School of Business FHNW), have written an extended abstract for Robophilosophy 2022 entitled "Tamagotchi on our Couch". The corresponding poster was presented by Katharina Kühne on August 16, 2022, the first day of the conference. From the abstract: "Although social robots increasingly enter our lives, it is not clear how they are perceived. Previous research indicates that there is a tendency to anthropomorphize social robots, at least in the Western culture. One of the most promising roles of robots in our society is companionship. Pets also fulfill this role, which gives their owners health and wellbeing benefits. In our study, we investigated if social robots can implicitly and explicitly be perceived as pets. In an online experiment, we measured implicit associations between pets and robots using pictures of robots and devices, as well as attributes denoting pet and non-pet features, in a Go/No-Go Association Task (GNAT). Further, we asked our participants to explicitly evaluate to what extent they perceive robots as pets and if robots could replace a real pet. Our findings show that implicitly, but not explicitly, social robots are perceived as pets." The poster is available here.
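
For readers unfamiliar with the method: the Go/No-Go Association Task is commonly scored with the signal-detection sensitivity index d′ per pairing block, with a higher d′ indicating a stronger implicit association. The sketch below shows that computation with invented numbers; it does not reproduce the analysis or the data of the study.

# Illustrative only: invented response counts, not data from the study.
# The GNAT is typically scored per block with d' = z(hit rate) - z(false-alarm rate).
from statistics import NormalDist

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """Signal-detection sensitivity with a simple correction against 0/1 rates."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical block: robot pictures paired with pet attributes as "go" targets
robot_pet = d_prime(hits=46, misses=4, false_alarms=12, correct_rejections=38)
# Hypothetical block: robot pictures paired with non-pet attributes
robot_nonpet = d_prime(hits=40, misses=10, false_alarms=18, correct_rejections=32)
print(f"robot+pet d' = {robot_pet:.2f}, robot+non-pet d' = {robot_nonpet:.2f}")

A higher d′ in the robot-plus-pet block than in the robot-plus-non-pet block would point to the kind of implicit pet association the authors report.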

Fig.: Pet without robot

Robots Mingle with Penguins

British filmmaker John Downer has created artificial monkeys, wolves, hippos, turtles, alligators, and other animals to observe the corresponding wildlife and obtain spectacular images. His well-known robots are very intricately designed and resemble the animals they mimic in almost every detail. However, such technically elaborate and artistically demanding means are not necessary for all species. USA Today reports in a recent article about a robot called ECHO: "ECHO is a remote-controlled ground robot that silently spies on the emperor penguin colony in Atka Bay. The robot is being monitored by the Single Penguin Observation and Tracking observatory. Both the SPOT observatory, which is also remote-operated through a satellite link, and the ECHO robot capture photographs and videos of animal population in the Arctic." (USA Today, May 6, 2022) ECHO does not resemble a penguin in any way; it is a yellow vehicle with four thick wheels. But as a video shows, the animals seem to have gotten used to it, and it comes very close to them without scaring them. Wildlife monitoring using robots is becoming increasingly important, and obviously very different types are being considered.

Fig.: Penguins in Antarctica

How Machines and Robots Can Support and Save Animals

On 25 February 2022, the article "Passive, Active, and Proactive Systems and Machines for the Protection and Preservation of Animals and Animal Species" by Oliver Bendel was published in Frontiers in Animal Science. From the abstract: "Digitalization and automation are expanding into many areas, resulting in more widespread use of partially and fully autonomous machines and robots. At the same time, environmental and other crises and disasters are on the rise, the world population is growing, and animals are losing their habitat. Increasingly, machines and robots such as agricultural vehicles, autonomous cars, robotic lawnmowers, or social robots are encountering animals of all kinds. In the process, the latter are injured or killed. Some machines can be designed so that this does not happen. Relevant disciplines and research areas briefly introduced here are machine ethics, social robotics, animal-machine interaction, and animal-computer interaction. In addition, animal welfare is important. Passive and active machines—as they are called in this review—are already appearing and help to observe and protect animals. Proactive machines may play a role in the future. They could use the possibilities of full automation and autonomy to save animals from suffering in agriculture or in the wild. During crises and disasters and in extensive nature reserves, they could observe, care for, and protect animals. The review provides initial considerations on active, passive, and proactive machines and how they can be used in an animal preservation context while bearing in mind recent technical and global developments." The article is part of the research topic "Animal-Computer Interaction and Beyond: The Benefits of Animal-Centered Research and Design" and can be accessed at www.frontiersin.org/articles/10.3389/fanim.2022.834634/full.

Fig.: Machines and robots can support and save animals

Hello Deer, Go Back to the Forest!

We use natural language, facial expressions, and gestures when communicating with our fellow humans. Some of our social robots also have these abilities, and so we can converse with them in the usual way. Many highly evolved animals have a language in which certain sounds and signals carry specific meanings. Some of them, like chimpanzees or gorillas, have facial and gestural abilities comparable to ours. Britt Selvitelle and Aza Raskin, founders of the Earth Species Project, want to use machine learning to enable communication between humans and animals. Languages, they believe, can be represented as geometric structures and translated by matching these structures to each other. They say they have started working on whale and dolphin communication. Over time, the focus will broaden to include primates, corvids, and others. It will be important for the two scientists to study not only vocal language but also facial expressions, gestures, and other movements associated with meaning (a challenge they are well aware of). In addition, there are aspects of animal communication that are inaudible or invisible to humans and would need to be considered. Britt Selvitelle and Aza Raskin believe that translation would open up the world of animals to us, but it could also be the other way around: they may first have to open up the world of animals in order to decode its languages. Should there be breakthroughs in this area, however, it would be an opportunity for animal welfare. For example, social robots, autonomous cars, wind turbines, and other machines could use animal languages alongside mechanical signals and human commands to instruct, warn, and scare away dogs, elk, pigs, and birds. Machine ethics has been developing animal-friendly machines for years. Among other things, researchers use sensors together with decision trees; depending on the situation, braking and evasive maneuvers are initiated (see the sketch below). Maybe one day the autonomous car will be able to avoid an accident by calling out in deer dialect: Hello deer, go back to the forest!
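
The sketch below illustrates, in very reduced form, how such a sensor-plus-decision-tree logic could look in an animal-friendly machine; the species list, distance thresholds, and the acoustic warning action are illustrative assumptions rather than a description of any existing prototype.

# Toy sketch of a sensor-plus-decision-tree logic for an animal-friendly machine.
# Species, distances, and actions are illustrative assumptions, not a real system.
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    SLOW_DOWN = auto()
    BRAKE = auto()
    EVADE = auto()
    EMIT_WARNING_SIGNAL = auto()   # could one day be a species-specific call

@dataclass
class Detection:
    species: str        # e.g. "deer" or "hedgehog", from an image classifier
    distance_m: float   # from lidar or ultrasound
    on_path: bool       # whether the animal is in the machine's trajectory

def decide(d: Detection) -> Action:
    """Hand-crafted decision tree: protect the animal first, then keep moving."""
    if not d.on_path:
        return Action.CONTINUE
    if d.distance_m < 2.0:
        return Action.BRAKE                      # too close: stop immediately
    if d.distance_m < 10.0:
        return Action.EVADE if d.species in {"hedgehog", "toad"} else Action.SLOW_DOWN
    return Action.EMIT_WARNING_SIGNAL            # give the animal time to move away

print(decide(Detection("deer", 8.0, True)))       # Action.SLOW_DOWN
print(decide(Detection("hedgehog", 3.0, True)))   # Action.EVADE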

Fig.: Three fawns

New Tasks for Cobots

After several postponements, the symposium "Applied AI in Healthcare: Safety, Community, and the Environment" will be held within the AAAI Spring Symposia on March 22-23, 2021. One of the presentations is titled "Co-Robots as Care Robots" (co-robots are also called cobots). The authors of the paper are Oliver Bendel, Alina Gasser, and Joel Siebenmann. From the abstract: "Cooperation and collaboration robots, co-robots or cobots for short, are an integral part of factories. For example, they work closely with the fitters in the automotive sector, and everyone does what they do best. However, the novel robots are not only relevant in production and logistics, but also in the service sector, especially where proximity between them and the users is desired or unavoidable. For decades, individual solutions of a very different kind have been developed in care. Now experts are increasingly relying on co-robots and teaching them the special tasks that are involved in care or therapy. This article presents the advantages, but also the disadvantages of co-robots in care and support, and provides information with regard to human-robot interaction and communication. The article is based on a model that has already been tested in various nursing and retirement homes, namely Lio from F&P Robotics, and uses results from accompanying studies. The authors can show that co-robots are ideal for care and support in many ways. Of course, it is also important to consider a few points in order to guarantee functionality and acceptance." More information about the AAAI Spring Symposia is available at aaai.org/Symposia/Spring/sss21.php.

Fig.: Lio’s arm

Robots and Rights

From a philosophical and ethical point of view, robots have no rights and cannot currently be granted any. An entity only has such rights if it can feel or suffer, if it has consciousness or a will to live. Accordingly, animals can have certain rights, stones cannot. Only human beings have human rights. Certain animals, such as chimpanzees or gorillas, can be granted basic rights. But granting these animals human rights makes no sense; they are not human beings. If one day robots can feel or suffer, if they have consciousness or a will to live, they must be granted rights. However, Oliver Bendel does not see any way to get there at the moment. According to him, one could at best develop "reverse cyborgs", i.e., let brain and nerve cells grow on technical structures (or in a robot). Such reverse or inverted cyborgs might at some point feel something. The newspaper Daily Star dealt with this topic on 28 December 2018. The article can be accessed via www.dailystar.co.uk/news/latest-news/748890/robots-ai-human-rights-legal-status-eu-proposal.

Fig.: A human brain could be part of a reverse cyborg