Robots, institutional roles and joint action: some key ethical issues

Journal Article (2025)
Author(s)

SRM Miller (University of Oxford, TU Delft - Ethics & Philosophy of Technology, Charles Sturt University)

Research Group
Ethics & Philosophy of Technology
DOI
https://doi.org/10.1007/s10676-024-09816-z
Publication Year
2025
Language
English
Issue number
1
Volume number
27
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In this article, firstly, cooperative interaction between robots and humans is discussed: specifically, the possibility of human/robot joint action and, relatedly, the possibility of robots occupying institutional roles alongside humans. The discussion makes use of concepts developed in social ontology. Secondly, certain key moral (or ethical—the terms are used interchangeably here) issues arising from this cooperative action are discussed, specifically issues that arise from robots performing (including qua role occupants) morally significant actions jointly with humans. Such morally significant human/robot joint actions, supposing they exist, could potentially range from humans and robots jointly caring for the infirm through to jointly killing enemy combatants.