Obstacle Detection and Avoidance onboard an MAV using a Monocular Event-based Camera

Abstract

Micro Air Vehicles (MAVs) are able to support humans in dangerous operations, such as search and rescue at night on unknown terrain. Such scenarios demand a high degree of autonomy from the MAV, as they are often radio- and GPS-denied. Since MAVs have limited computational resources and energy storage, onboard navigation tasks have to be performed efficiently and quickly. To address this challenge, this research proposes an approach to visual obstacle detection and avoidance onboard an MAV. The algorithmic approach is based on event-based optic flow, using a monocular event-based camera. This camera captures the apparent motion in the scene with microsecond latency and very low power consumption, making it a good fit for onboard navigation tasks.

Firstly, a literature study is performed to provide the theoretical concepts and foundation for the obstacle avoidance approach. A processing pipeline is then designed around event-based normal optic flow, consisting of three stages: course estimation, obstacle detection and obstacle avoidance. A novel course estimation method, 'FAITH', is proposed, which uses optic flow half-planes along with a fast RANSAC scheme. The obstacle detection method is based on DBSCAN clustering of optic flow vectors, using the time-to-contact and vector location as clustering variables.

The performance of these methods is demonstrated in three experiments: in a simulated environment, offline on real sensor data, and online onboard an MAV. As no event-based obstacle avoidance datasets are currently publicly available, a dataset is recorded as a supplement to this and future research. Approximately 1350 runs of event-based camera, RADAR, IMU and OptiTrack data are recorded while manually flying an MAV around either one or two poles in the flying arena of TU Delft. This dataset is used in this research to determine the performance of the course estimation method on real sensor data. The course estimation method is shown to achieve state-of-the-art accuracy and beyond state-of-the-art computation time on both simulated data and the recorded dataset. The final experiment shows the obstacle detection and avoidance approach integrated onboard an MAV in a real-time obstacle avoidance task, where it achieves a success rate of 80% in a frontal avoidance task on a low-textured, 50-cm-wide pole. The contribution of this research is an obstacle detection and avoidance approach using a monocular event-based camera onboard an MAV, along with the novel course estimation algorithm 'FAITH'.
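
The obstacle detection stage described above, DBSCAN clustering of optic flow vectors using time-to-contact and image location as clustering variables, can be illustrated with a minimal sketch. The feature weighting, DBSCAN parameters, and sample values below are illustrative assumptions for exposition only, not the thesis' actual configuration.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical input: one row per optic-flow vector, with image
# location (x, y) in pixels and time-to-contact (TTC) in seconds.
flow = np.array([
    [120, 80, 0.90], [122, 83, 0.80], [118, 79, 0.85],   # vectors on a nearby obstacle
    [300, 200, 3.0], [305, 198, 3.2], [298, 204, 3.1],   # vectors on the background
])

# Scale the features so pixel coordinates and TTC contribute comparably;
# these weights are an assumed choice, not values from the thesis.
scaled = flow / np.array([100.0, 100.0, 1.0])

# Cluster the vectors; points labelled -1 are treated as noise.
labels = DBSCAN(eps=0.5, min_samples=3).fit_predict(scaled)

# Report each cluster's mean TTC; the lowest value marks the most urgent obstacle.
for label in sorted(set(labels) - {-1}):
    cluster = flow[labels == label]
    print(f"cluster {label}: mean TTC = {cluster[:, 2].mean():.2f} s")
```

In this sketch, vectors that are close together in the image and share a similar time-to-contact fall into one cluster, so a nearby obstacle separates from the background purely on the basis of the optic flow, without any appearance model.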