Radar Perception for Autonomous Unmanned Aerial Vehicles: A Survey


Abstract

The advent of consumer and industrial Unmanned Aerial Vehicles (UAVs), commonly referred to as drones, has opened business opportunities in many fields, including logistics, smart agriculture, inspection, surveillance, and construction. In addition, the autonomous operation of UAVs reduces risk by minimizing the time human workers spend in harsh environments and lowers costs by automating tasks. For reliability and safety, drones must sense and avoid potential obstacles and must be capable of navigating safely in unknown environments. UAV perception must remain reliable in conditions such as high dust levels, humidity, intense sun glare, darkness, and fog, which can severely degrade many conventional sensing methods. Radar systems have unique strengths: they can reliably estimate the distance to an object and measure its relative speed via the Doppler effect. Moreover, because radars sense with radio waves, they perform well in rain, fog, snow, or smoky environments. This stands in contrast to optical technologies, such as cameras or Light Detection and Ranging (lidar), which are susceptible to the same challenges as the human eye. This survey addresses the signal processing challenges involved in exploiting radar systems for advanced perception on unmanned aerial vehicles, considering recent integration trends and technology capabilities. The focus is on signal processing techniques for low-cost, power-efficient radar sensors that operate onboard the UAV in real time to meet its needs in terms of perception, situational awareness, and navigation. Finally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, safe, and autonomous way for UAVs to perceive and interact with the world.
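
As a brief illustration of the Doppler measurement mentioned above, the relation below sketches how a monostatic radar maps a target's radial velocity to a frequency shift; the symbols ($f_D$, $v_r$, $\lambda$, $f_c$) and the 77 GHz example carrier are illustrative assumptions, not parameters of any particular system discussed in the survey:

\[
  f_D = \frac{2\, v_r}{\lambda}, \qquad \lambda = \frac{c}{f_c}
\]

For example, assuming a 77 GHz automotive-grade carrier ($\lambda \approx 3.9$ mm), a radial velocity of 1 m/s produces a Doppler shift of roughly 513 Hz, which is readily resolvable by standard radar signal processing.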