Automatic Classification of Unmanned Aerial Vehicles with Radars On-The-Move

Master Thesis (2022)
Author(s)

H. Haifawi (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Francesco Fioranelli – Mentor (TU Delft - Microwave Sensing, Signals & Systems)

O. Yarovyi – Mentor (TU Delft - Microwave Sensing, Signals & Systems)

Rob Van Der Meer – Mentor (Robin Radar Systems)

Julian F.P. Kooij – Graduation committee member (TU Delft - Intelligent Vehicles)

Faculty
Electrical Engineering, Mathematics and Computer Science
Copyright
© 2022 Hani Haifawi
Publication Year
2022
Language
English
Graduation Date
22-08-2022
Awarding Institution
Delft University of Technology
Programme
Electrical Engineering | Signals and Systems
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Drone detection and tracking systems have become a requirement at most public, private and political events because of the increasing risk of unintentional or malicious misuse of these platforms. To ensure adequate protection, full spatial coverage is essential for any such system. However, the research literature focuses on staring radars, which have a limited field of view but yield rich target information via time-frequency distributions that facilitate the target recognition task. This thesis instead considers surveillance radars that offer full spatial coverage, although their use for classification is more complex because the rotating nature of their antennas limits the dwell time on targets.
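
As an illustration of the time-frequency distributions mentioned above, the sketch below computes a micro-Doppler spectrogram of a simulated slow-time return using a short-time Fourier transform. The signal model, pulse repetition frequency and STFT parameters are illustrative assumptions, not the processing settings used in the thesis.

    # Minimal sketch: micro-Doppler spectrogram of a simulated rotor return.
    # All parameters and the toy signal model are assumptions for illustration.
    import numpy as np
    from scipy import signal

    prf = 2000.0                                  # pulse repetition frequency in Hz (assumed)
    t = np.arange(0, 1.0, 1.0 / prf)              # 1 s of slow-time samples
    f_rot, f_max = 20.0, 300.0                    # rotor rate and peak blade Doppler in Hz (assumed)

    # Toy return: body at +50 Hz Doppler plus a sinusoidally modulated blade
    # component that mimics the micro-Doppler signature of a rotating rotor.
    body = np.exp(1j * 2 * np.pi * 50.0 * t)
    blades = 0.3 * np.exp(-1j * (f_max / f_rot) * np.cos(2 * np.pi * f_rot * t))
    noise = 0.05 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))
    x = body + blades + noise

    # Short-time Fourier transform -> time-frequency (micro-Doppler) map in dB.
    f, tt, Sxx = signal.spectrogram(x, fs=prf, window='hann', nperseg=128,
                                    noverlap=96, return_onesided=False,
                                    mode='magnitude')
    spec_db = 20.0 * np.log10(np.fft.fftshift(Sxx, axes=0) + 1e-12)
    print(spec_db.shape)  # (Doppler bins, time frames)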

Additionally, due to the rapid growth of the drone market, counter-drone radars that can jointly localize and classify small targets while on-the-move now represent a highly in-demand remote sensing system. Nonetheless, surveillance sensors mounted on moving vehicles are a new technology that is still under development. This work therefore investigates surveillance radars in this novel scenario, and presents the technological challenges alongside the proposed solutions to achieve reliable object detection with ground-based counter-drone radars on-the-move. Specifically, the pre-processing steps required to remove clutter from the data while the radar is rotating and moving on the ground are developed and discussed.
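
The specific clutter-suppression chain developed in the thesis is not reproduced here; the sketch below only illustrates a generic moving-target-indication style high-pass filter along slow time, which attenuates near-zero-Doppler ground clutter in a range/slow-time data matrix. The array layout, cutoff and filter order are assumptions.

    # Minimal sketch of MTI-style clutter suppression along slow time.
    # Generic illustration only, not the thesis' specific pre-processing.
    import numpy as np
    from scipy import signal

    def suppress_clutter(data, prf, cutoff_hz=30.0, order=4):
        """High-pass filter each range bin along slow time to attenuate
        near-zero-Doppler (stationary) clutter.

        data : complex array of shape (num_pulses, num_range_bins) -- assumed layout
        prf  : pulse repetition frequency in Hz
        """
        sos = signal.butter(order, cutoff_hz, btype='highpass', fs=prf, output='sos')
        return signal.sosfilt(sos, data, axis=0)

    # Toy example: strong static clutter everywhere plus one weak moving target.
    prf = 2000.0
    pulses, bins = 256, 64
    rng = np.random.default_rng(0)
    t = np.arange(pulses)[:, None] / prf
    clutter = 10.0 * np.ones((pulses, bins), dtype=complex)
    target = 0.5 * np.exp(1j * 2 * np.pi * 200.0 * t) * (np.arange(bins) == 20)
    noise = 0.1 * (rng.standard_normal((pulses, bins)) + 1j * rng.standard_normal((pulses, bins)))
    filtered = suppress_clutter(clutter + target + noise, prf)

    # The moving target (range bin 20) survives while the static clutter is attenuated.
    print(np.abs(filtered[:, 20]).mean(), np.abs(filtered[:, 0]).mean())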

Finally, the joint detection and classification problem is traditionally solved separately by different algorithms due to the computational complexity of the task. This thesis presents a novel framework that localizes and labels drones in a unified pipeline, framed as object detection in computer vision, and that can operate while static or on-the-move. To this end, an end-to-end radar data processing architecture, robust against homogeneity constraints and based on the You Only Look Once (YOLO) model, is used to perform object detection in real time. In brief, this work opens new avenues towards multi-class and multi-instance plot-based target detection and classification by transferring cross-disciplinary algorithms from computer vision into remote sensing.
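
The network configuration and training data used in the thesis are described in the thesis itself; the snippet below is only a minimal sketch of how a YOLO detector could be fine-tuned and applied with the open-source Ultralytics package, assuming the radar plots have been exported as images with bounding-box labels in YOLO format. The dataset file, weights, image names and hyperparameters are placeholders, not the thesis' setup.

    # Minimal sketch: fine-tuning and running a YOLO detector on radar plot images.
    # Paths, weights and hyperparameters below are hypothetical placeholders.
    from ultralytics import YOLO

    # Start from a pretrained backbone and fine-tune on radar images described
    # by a standard YOLO dataset definition file (hypothetical path).
    model = YOLO('yolov8n.pt')
    model.train(data='radar_plots.yaml', epochs=100, imgsz=640)

    # Inference on a new radar frame exported as an image (hypothetical file).
    results = model.predict(source='radar_frame_0001.png', conf=0.25)
    for r in results:
        for box in r.boxes:
            print(int(box.cls), float(box.conf), box.xyxy.tolist())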

Files

MSc_Thesis_HH.pdf
(pdf | 28.3 MB)
- Embargo expired on 22-08-2024
License info not available