Towards a Framework for Certification of Reliable Autonomous Systems

Journal Article (2021)
Author(s)

Michael Fisher (The University of Manchester)

Viviana Mascardi (University of Genova)

Kristin Yvonne Rozier (Iowa State University)

Bernd-Holger Schlingloff (Fraunhofer FOKUS, Humboldt-Universität zu Berlin)

Michael Winikoff (Victoria University of Wellington)

N. Yorke-Smith (TU Delft - Algorithmics)

Research Group
Algorithmics
Copyright
© 2021 Michael Fisher, Viviana Mascardi, Kristin Yvonne Rozier, Bernd-Holger Schlingloff, Michael Winikoff, N. Yorke-Smith
DOI related publication
https://doi.org/10.1007/s10458-020-09487-2
Publication Year
2021
Language
English
Issue number
1
Volume number
35
Pages (from-to)
1-65
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

A computational system is called autonomous if it can make its own decisions, or take its own actions, without human supervision or control. The capability and spread of such systems have reached the point where they are beginning to touch much of everyday life. However, regulators grapple with how to deal with autonomous systems; for example, how could we certify an Unmanned Aerial System for autonomous use in civilian airspace? Here we analyse what is needed to provide verified reliable behaviour of an autonomous system, assess what the state of the art in automated verification can achieve, and propose a roadmap towards developing regulatory guidelines, including articulating challenges to researchers, to engineers, and to regulators. Case studies in seven distinct domains illustrate the article.