Title: Towards a Framework for Certification of Reliable Autonomous Systems

Authors: Fisher, Michael (The University of Manchester); Mascardi, Viviana (University of Genova); Rozier, Kristin Yvonne (Iowa State University); Schlingloff, Bernd-Holger (Humboldt-Universität zu Berlin; Fraunhofer FOKUS); Winikoff, Michael (Victoria University of Wellington); Yorke-Smith, N. (TU Delft Algorithmics)

Date: 2021

Abstract: A computational system is called autonomous if it is able to make its own decisions, or take its own actions, without human supervision or control. The capability and spread of such systems have reached the point where they are beginning to touch much of everyday life. However, regulators grapple with how to deal with autonomous systems; for example, how could we certify an Unmanned Aerial System for autonomous use in civilian airspace? Here we analyse what is needed to provide verified reliable behaviour of an autonomous system, survey what the state of the art in automated verification can achieve, and propose a roadmap towards developing regulatory guidelines, articulating challenges to researchers, engineers, and regulators. Case studies in seven distinct domains illustrate the article.

Subjects: Artificial intelligence; Autonomous systems; Certification; Verification

To reference this document use: http://resolver.tudelft.nl/uuid:5ad2715b-afa6-410b-9c6f-9c9f9376ba23

DOI: https://doi.org/10.1007/s10458-020-09487-2

ISSN: 1387-2532

Source: Autonomous Agents and Multi-Agent Systems, 35(1), 1-65

Part of collection: Institutional Repository

Document type: journal article

Rights: © 2021 Michael Fisher, Viviana Mascardi, Kristin Yvonne Rozier, Bernd-Holger Schlingloff, Michael Winikoff, N. Yorke-Smith

Files: Fisher2020_Article_Toward ... ificat.pdf (2.12 MB)