Flying and Ground Robot Collaboration for Camera-based Search and Rescue

Master Thesis (2024)
Author(s)

B. Esteves Henriques (TU Delft - Aerospace Engineering)

Contributor(s)

A. Jamshidnejad – Mentor (TU Delft - Control & Simulation)

M. Baglioni – Graduation committee member (TU Delft - Control & Simulation)

Faculty
Aerospace Engineering
Copyright
© 2024 Bernardo Esteves Henriques
Publication Year
2024
Language
English
Graduation Date
05-02-2024
Awarding Institution
Delft University of Technology
Programme
Aerospace Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Search and Rescue (SaR) missions present challenges due to the complexity of disaster scenarios, and most loss of life and injury occurs in developing countries. Robotics has become indispensable for rapidly locating disaster victims, and combining flying and ground robots serves this purpose more effectively because of their complementary features. To this end, a cost-effective framework for performing conventional SaR tasks is presented. The method leverages You Only Look Once (YOLO) applied to video streams from an Unmanned Ground Vehicle (UGV) and an Unmanned Aerial Vehicle (UAV). When pose estimation was exploited for human depth estimation, the algorithm proved susceptible to variations in pose. In tracking object trajectories, the collaboration is advantageous for wide-area, cluttered trajectories rather than for narrow-area, unobstructed ones. In mapping terrain elevation, errors drop significantly with the assistance of the UAV. Moving forward, devising adaptable strategies tailored to diverse SaR scenarios will be pivotal.

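For readers unfamiliar with the detection pipeline the abstract refers to, the sketch below illustrates the general idea: a YOLO detector run on a robot video stream, with a rough pinhole-camera depth estimate derived from the detected bounding-box height. This is not the thesis implementation; the package (ultralytics), weight file, stream URL, focal length, and assumed person height are illustrative assumptions, and the bounding-box height stands in for the pose-keypoint-based height the thesis actually uses, which is precisely where the pose sensitivity noted in the abstract arises.

# Minimal sketch (not the thesis code): person detection on a robot video
# stream with a YOLO model, plus a rough pinhole-model depth estimate from
# the bounding-box height. Package choice (ultralytics), weight file,
# stream URL, focal length, and person height are illustrative assumptions.

import cv2
from ultralytics import YOLO

FOCAL_LENGTH_PX = 800.0   # assumed focal length in pixels (from camera calibration)
PERSON_HEIGHT_M = 1.70    # assumed average standing person height in metres

model = YOLO("yolov8n.pt")                          # any YOLO detection weights
cap = cv2.VideoCapture("rtsp://ugv-camera/stream")  # hypothetical UGV/UAV stream URL

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # Run detection, keeping only the "person" class (COCO class id 0).
    result = model(frame, classes=[0], verbose=False)[0]

    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        h_px = y2 - y1
        # Pinhole-camera approximation: depth ~ focal length * real height / pixel height.
        # The estimate degrades when the person is not standing upright, which is
        # the pose dependence the abstract refers to.
        depth_m = FOCAL_LENGTH_PX * PERSON_HEIGHT_M / max(h_px, 1.0)
        print(f"person conf={float(box.conf[0]):.2f} est. depth={depth_m:.1f} m")

cap.release()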