Towards Mobile Robot Deployments with Goal Autonomy in Search-and-Rescue

Discovering Tasks and Constructing Actionable Environment Representations using Situational Affordances


Abstract

In this work, we address the challenges of employing robots in the Search-and-Rescue (SAR) domain, where they can help rescue workers quickly obtain Situational Awareness (SA). Missions with autonomous mobile robots depend heavily on environment representations. These representations have steadily increased in the richness they can capture: in addition to geometry, we can now represent objects and their interrelations. Richer environment representations, such as the 3D scene graph, offer the opportunity to integrate prior knowledge, making the representations more actionable and improving the SA they provide. However, the use of such representations for planning remains limited. In previous work, the main approach was to augment and extend the scene graph to enable traditional symbolic planning. These methods are offline, scale poorly, and require full observability, making them unsuitable for the SAR domain. The main contributions of this work are as follows. First, we propose the behavior-oriented situational graph, a data structure that integrates data-driven perception with prior knowledge following a novel situational affordance schema. This schema connects situations with robot behaviors and mission objectives, enabling autonomous mission planning. Second, we propose an efficient method to obtain task utilities from the behavior-oriented situational graph through planning. Third, we propose an exploration component to discover and select tasks online in dynamic, potentially partially observable environments. We evaluate this work in several simulation scenarios, showing improved efficiency in mission completion compared to offline methods for the SAR domain. Finally, the methods are implemented and tested in a real-world scenario on a mobile Spot robot, demonstrating their effectiveness in practice.
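To make the situational affordance schema concrete, the following is a minimal illustrative sketch, not the thesis's actual implementation: all class names, fields, and utility values below are assumptions chosen to mirror the abstract's description of a schema that links perceived situations to robot behaviors and mission objectives, from which pending tasks can be ranked by utility.

```python
# Hedged sketch of a "behavior-oriented situational graph" with a
# situational affordance schema. All names and values are illustrative
# assumptions, not the published API of the thesis.
from dataclasses import dataclass


@dataclass(frozen=True)
class Affordance:
    """Schema entry linking a situation to a behavior and an objective."""
    situation: str   # e.g. a "closed_door" detected by perception
    behavior: str    # e.g. an "open_door" robot skill
    objective: str   # mission objective the behavior serves
    utility: float   # assumed expected contribution to the mission


@dataclass
class SituationNode:
    """A perceived situation stored as a node in the graph."""
    node_id: int
    situation: str
    position: tuple  # (x, y) location of the situation in the map
    resolved: bool = False


class SituationalGraph:
    """Minimal graph that turns perceived situations into ranked tasks."""

    def __init__(self, schema):
        # Index the affordance schema by situation type.
        self.schema = {a.situation: a for a in schema}
        self.nodes = []

    def add_situation(self, node):
        """Insert a situation discovered online by perception."""
        self.nodes.append(node)

    def pending_tasks(self):
        """Pair unresolved situations with applicable behaviors,
        ordered by utility (highest first)."""
        tasks = [(n, self.schema[n.situation])
                 for n in self.nodes
                 if not n.resolved and n.situation in self.schema]
        return sorted(tasks, key=lambda t: t[1].utility, reverse=True)


# Example: two situations discovered during exploration.
schema = [
    Affordance("closed_door", "open_door", "explore_room", utility=0.8),
    Affordance("frontier", "goto_frontier", "explore_area", utility=0.5),
]
graph = SituationalGraph(schema)
graph.add_situation(SituationNode(0, "frontier", (2.0, 1.0)))
graph.add_situation(SituationNode(1, "closed_door", (5.0, 3.0)))

best_node, best_affordance = graph.pending_tasks()[0]
print(best_affordance.behavior)  # -> open_door
```

In this toy setup the planner would select the door-opening task first because its (assumed) utility is higher; the real system derives utilities through planning over the graph rather than from fixed constants.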