Towards Mobile Robot Deployments with Goal Autonomy in Search-and-Rescue

Discovering Tasks and Constructing Actionable Environment Representations using Situational Affordances

Master Thesis (2022)
Author(s)

W.J. Meijer (TU Delft - Mechanical Engineering)

Contributor(s)

N. Yorke-Smith – Graduation committee member (TU Delft - Algorithmics)

Carlos Hernandez Corbato – Graduation committee member (TU Delft - Robot Dynamics)

Jeroen Fransman – Mentor (TU Delft - Team Bart De Schutter)

J. Sijs – Mentor (TU Delft - Learning & Autonomous Control)

Faculty
Mechanical Engineering
Copyright
© 2022 Wouter Meijer
Publication Year
2022
Language
English
Graduation Date
21-10-2022
Awarding Institution
Delft University of Technology
Project
Safe autoNomous systems in the Open World (SNOW)
Programme
Mechanical Engineering | Vehicle Engineering | Cognitive Robotics
Sponsors
TNO
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

In this work, we address the challenges of deploying robots in the Search-and-Rescue (SAR) domain, where they can help rescue workers quickly obtain Situational Awareness (SA). Missions with autonomous mobile robots depend heavily on environmental representations. Such representations have steadily grown richer: in addition to geometry, they can now capture objects and their interrelations. These richer environmental representations, such as the 3D scene graph, create an opportunity to integrate data-driven perception with prior knowledge, making representations more actionable and improving the SA they provide. However, the use of such representations for planning remains limited. In previous work, the main approach was to augment and extend the scene graph to enable traditional symbolic planning. These methods are offline, scale poorly, and require full observability, making them unsuitable for the SAR domain. The main contributions of this work are as follows. First, we propose the behavior-oriented situational graph, a data structure that integrates data-driven perception with prior knowledge following a novel situational affordance schema. This schema connects situations with robot behaviors and mission objectives, enabling autonomous mission planning. Second, we propose an efficient method to obtain task utilities from the proposed behavior-oriented situational graph through planning. Finally, we propose an exploration component to discover and select tasks online in dynamic, potentially partially observable environments. This work is evaluated in several simulation scenarios, showing improved mission-completion efficiency compared to offline methods in the SAR domain. Finally, the methods are implemented and tested in a real-world scenario on a mobile Spot robot, demonstrating their effectiveness in practice.
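To make the situational affordance schema described above concrete, the following is a minimal illustrative sketch (not the thesis implementation): it assumes hypothetical names (`Affordance`, `SituationalGraph`, the situation and behavior labels, and the utility values) and replaces the thesis's planning-based utility computation with a simple greedy selection over prior utilities.

```python
from dataclasses import dataclass, field

# Hypothetical schema entry: a perceived situation (e.g. a doorway)
# affords a robot behavior (e.g. passing through), which serves a
# mission objective, with a prior utility attached.
@dataclass(frozen=True)
class Affordance:
    situation: str   # perceived situation type, e.g. "doorway"
    behavior: str    # robot behavior it affords, e.g. "pass_through"
    objective: str   # mission objective served, e.g. "explore"
    utility: float   # prior utility of executing the behavior here

@dataclass
class SituationalGraph:
    affordances: list = field(default_factory=list)  # the schema
    observed: list = field(default_factory=list)     # situations seen so far

    def add_observation(self, situation: str) -> None:
        """Record a situation perceived online during the mission."""
        self.observed.append(situation)

    def discover_tasks(self):
        """Match observed situations against the schema to yield tasks."""
        return [a for a in self.affordances
                for s in self.observed if a.situation == s]

    def best_task(self):
        """Greedy task selection by prior utility (stand-in for planning)."""
        tasks = self.discover_tasks()
        return max(tasks, key=lambda a: a.utility) if tasks else None

# Example: two schema entries, two online observations.
schema = [
    Affordance("doorway", "pass_through", "explore", 0.6),
    Affordance("victim", "assess", "rescue", 1.0),
]
g = SituationalGraph(affordances=schema)
g.add_observation("doorway")
g.add_observation("victim")
print(g.best_task().behavior)  # -> assess
```

The sketch shows only the core idea of the schema: tasks are not fixed in advance but discovered as situations are observed, so task selection can run online under partial observability.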
