Sparsity-Aware Occupancy Grid Mapping for Automotive Driving Using Radar-LiDAR Fusion
P. Zhai (TU Delft - Signal Processing Systems)
G. Joseph (TU Delft - Signal Processing Systems)
N.J. Myers (TU Delft - Team Nitin Myers)
Çağan Önen (NXP Semiconductors)
Ashish Pandharipande (NXP Semiconductors)
Abstract
We tackle the problem of estimating a binary occupancy grid map by fusing point cloud data from LiDAR and radar sensors for automotive driving perception. To this end, we introduce two sparsity-aware measurement models for fusion, formulating occupancy mapping as a sparse binary vector reconstruction problem. The first model jointly estimates a common map from all measurements, while the second assumes a shared map plus an innovation component for each modality. We recover the maps with the pattern-coupled sparse Bayesian learning algorithm, which exploits the inherent sparsity and spatial dependencies of automotive occupancy maps. Numerical experiments on the public RADIATE dataset show that our fusion-based approach improves mapping accuracy over single-modality and high-level fusion mapping algorithms.
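The sparse reconstruction formulation in the abstract can be illustrated with a minimal sketch: LiDAR and radar measurements are stacked into one linear model y = A x + n over a binary occupancy vector x (the "common map" formulation), and x is recovered with a sparsity-promoting Bayesian loop. All dimensions, matrices, and the use of plain sparse Bayesian learning (rather than the pattern-coupled variant, whose hyperparameters additionally couple neighboring grid cells) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): N grid cells,
# M_l LiDAR and M_r radar measurements.
N, M_l, M_r = 100, 40, 40

# Sparse binary occupancy map: a few occupied cells.
x_true = np.zeros(N)
x_true[rng.choice(N, size=8, replace=False)] = 1.0

# Hypothetical linear measurement operators for each modality,
# stacked into one fused system.
A_lidar = rng.standard_normal((M_l, N)) / np.sqrt(M_l + M_r)
A_radar = rng.standard_normal((M_r, N)) / np.sqrt(M_l + M_r)
A = np.vstack([A_lidar, A_radar])
sigma = 0.01
y = A @ x_true + sigma * rng.standard_normal(M_l + M_r)

# Plain SBL via EM: Gaussian prior x_i ~ N(0, 1/alpha_i).  The
# pattern-coupled variant would also tie alpha_i to its grid
# neighbors to capture spatial dependencies in the map.
alpha = np.ones(N)
sigma2 = sigma ** 2
for _ in range(100):
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(alpha))
    mu = Sigma @ A.T @ y / sigma2
    alpha = 1.0 / (mu ** 2 + np.diag(Sigma) + 1e-12)  # EM update

# Threshold the posterior mean to a binary occupancy estimate.
x_hat = (mu > 0.5).astype(float)
```

Thresholding the posterior mean at 0.5 yields the binary map in this toy setting; the paper's second model (shared map plus per-modality innovation components) would instead estimate separate sparse vectors per sensor on top of the common one.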