Dismantling Digital Cages

Examining Design Practices for Public Algorithmic Systems

Conference Paper (2022)
Author(s)

S.J.J. Nouws (TU Delft - Information and Communication Technology)

M.F.W.H.A. Janssen (TU Delft - Engineering, Systems and Services)

R.I.J. Dobbe (TU Delft - Information and Communication Technology)

Research Group
Information and Communication Technology
Copyright
© 2022 S.J.J. Nouws, M.F.W.H.A. Janssen, R.I.J. Dobbe
DOI related publication
https://doi.org/10.1007/978-3-031-15086-9_20
Publication Year
2022
Language
English
Bibliographical Note
Green Open Access added to TU Delft Institutional Repository as part of the Taverne project 'You share, we take care!' (https://www.openaccess.nl/en/you-share-we-take-care). Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.
Pages (from-to)
307-322
ISBN (print)
978-3-031-15085-2
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Algorithmic systems used in public administration can create or reinforce digital cages. A digital cage refers to algorithmic systems or information architectures that create their own reality through formalization, frequently resulting in incorrect automated decisions with severe impacts on citizens. Although much research has identified how algorithmic artefacts can contribute to digital cages and their unintended consequences, the emergence of digital cages from human actions and institutions is poorly understood. Embracing a broader lens on how technology, human activity, and institutions shape each other, this paper explores which design practices in public organizations can result in the emergence of digital cages. Using Orlikowski’s structurational model of technology, we identified four design practices through observations and interviews conducted at a consortium of public organizations. This study shows that design processes of public algorithmic systems (1) are often narrowly focused on technical artefacts, (2) disregard the normative basis for these systems, (3) depend on the involved actors’ awareness of the socio-technical nature of public algorithmic systems, and (4) are approached as linear rather than iterative. These four practices indicate that institutions and human actions in design processes can contribute to the emergence of digital cages, but also that institutional, as opposed to technical, possibilities to address their unintended consequences are often ignored. Further research is needed to examine how design processes in public organizations can evolve into socio-technical processes, how they can become more democratic, and how power asymmetries in the design process can be mitigated.

Files

978_3_031_15086_9_20.pdf
(pdf | 0.338 MB)
- Embargo expired on 30-04-2023
License info not available