The seven troubles with norm-compliant robots

Journal Article (2023)
Author(s)

T.N. Coggins (TU Delft - Values Technology and Innovation, TU Delft - Ethics & Philosophy of Technology)

S. Steinert (TU Delft - Values Technology and Innovation, TU Delft - Ethics & Philosophy of Technology)

Research Group
Ethics & Philosophy of Technology
Copyright
© 2023 T.N. Coggins, S. Steinert
DOI related publication
https://doi.org/10.1007/s10676-023-09701-1
Publication Year
2023
Language
English
Issue number
2
Volume number
25
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Many researchers from robotics, machine ethics, and adjacent fields seem to assume that norms represent good behavior that social robots should learn in order to benefit their users and society. We would like to complicate this view and present seven key troubles with norm-compliant robots: (1) norm biases, (2) paternalism, (3) tyrannies of the majority, (4) pluralistic ignorance, (5) paths of least resistance, (6) outdated norms, and (7) technologically induced norm change. Because discussions of why norm-compliant robots can be problematic are noticeably absent from the robot and machine ethics literature, this paper fills an important research gap. We argue that it is critical for researchers to take these issues into account if they wish to make norm-compliant robots.