Sparsity-Aware Occupancy Grid Mapping for Automotive Driving Using Radar-LiDAR Fusion

Conference Paper (2024)
Author(s)

P. Zhai (TU Delft - Signal Processing Systems)

G. Joseph (TU Delft - Signal Processing Systems)

N.J. Myers (TU Delft - Team Nitin Myers)

Çağan Önen (NXP Semiconductors)

Ashish Pandharipande (NXP Semiconductors)

Research Group
Signal Processing Systems
DOI
https://doi.org/10.1109/SENSORS60989.2024.10785054
Publication Year
2024
Language
English
ISBN (electronic)
9798350363517
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

We tackle the problem of estimating a binary occupancy grid map by fusing point cloud data from LiDAR and radar sensors for automotive driving perception. To this end, we introduce two sparsity measurement models for fusion, formulating occupancy mapping as a sparse binary vector reconstruction problem. The first model jointly estimates a common map from all measurements, while the second assumes a shared map and an innovation component for each modality's measurements. We use the pattern-coupled sparse Bayesian learning algorithm to recover maps, leveraging the inherent sparsity and spatial dependencies in automotive occupancy maps. Numerical experiments on the RADIATE public dataset show that our fusion-based approach improves mapping accuracy compared to single-modality and high-level fusion mapping algorithms.
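The formulation in the abstract, viewing the occupancy grid as a sparse binary vector and the fused LiDAR/radar point clouds as linear measurements of it, can be illustrated with a small numerical sketch. The grid size, the random Gaussian sensing matrices, and the iterative hard-thresholding recovery below are all illustrative assumptions standing in for the paper's actual point-cloud measurement models and its pattern-coupled sparse Bayesian learning algorithm; only the stacked-measurement structure of the first (common-map) fusion model is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy occupancy map: a sparse binary vector over a flattened grid.
# (Sizes are arbitrary, chosen only for illustration.)
n = 100          # number of grid cells
k = 8            # number of occupied cells
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = 1.0

# Common-map fusion model: stack LiDAR and radar measurements and jointly
# estimate one shared map. Random Gaussian matrices are stand-ins for the
# actual sensor measurement models.
m_lidar, m_radar = 40, 30
A_lidar = rng.standard_normal((m_lidar, n)) / np.sqrt(m_lidar)
A_radar = rng.standard_normal((m_radar, n)) / np.sqrt(m_radar)
A = np.vstack([A_lidar, A_radar])
y = A @ x + 0.01 * rng.standard_normal(m_lidar + m_radar)

# Recover the sparse map with iterative hard thresholding, a simple
# stand-in for pattern-coupled sparse Bayesian learning.
step = 1.0 / np.linalg.norm(A, ord=2) ** 2   # safe gradient step size
x_hat = np.zeros(n)
for _ in range(300):
    x_hat = x_hat + step * (A.T @ (y - A @ x_hat))
    idx = np.argsort(np.abs(x_hat))[:-k]     # all but the k largest entries
    x_hat[idx] = 0.0                         # enforce k-sparsity

occupied_est = np.flatnonzero(x_hat > 0.5)   # binarize the estimate
```

In this sketch the two modalities contribute rows of one stacked system, which is the sense in which the first model "jointly estimates a common map from all measurements"; the paper's second model additionally gives each modality its own innovation component on top of the shared map.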

Files

Sparsity-Aware_Occupancy_Grid_... (pdf)
(pdf | 4.84 Mb)
- Embargo expired on 23-06-2025
License info not available