Distributed multi-target tracking and active perception with mobile camera networks

Journal Article (2024)
Author(s)

Sara Casao (Universidad de Zaragoza)

A. Serra Gomez (TU Delft - Learning & Autonomous Control)

Ana C. Murillo (Universidad de Zaragoza)

J.W. Böhmer (TU Delft - Algorithmics)

J. Alonso-Mora (TU Delft - Learning & Autonomous Control)

Eduardo Montijano (Universidad de Zaragoza)

Research Group
Learning & Autonomous Control
Copyright
© 2024 S. Casao, A. Serra Gomez, Ana C. Murillo, J.W. Böhmer, J. Alonso-Mora, Eduardo Montijano
DOI
https://doi.org/10.1016/j.cviu.2023.103876
Publication Year
2024
Language
English
Volume number
238
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Smart cameras are an essential component in surveillance and monitoring applications, and they have typically been deployed in networks of fixed camera locations. The addition of mobile cameras, mounted on robots, can overcome some of the limitations of static networks, such as blind spots or backlighting, by actively positioning cameras to gather the most informative views at each moment. This work presents a hybrid camera system, with static and mobile cameras, where all the cameras collaborate to observe people moving freely in the environment and to efficiently observe specific attributes of each person. Our solution combines a distributed multi-camera tracking system, which precisely localizes all the people, with a control scheme that moves the mobile cameras to the best viewpoints for a specific classification task. The main contribution of this paper is a novel framework that exploits the synergies arising from the cooperation of the tracking and control modules, yielding a system closer to real-world deployment and capable of high-level scene understanding. The static camera network provides global awareness to the control scheme that moves the robots. In exchange, the mobile cameras onboard the robots provide enhanced information about the people in the scene. Using a photo-realistic simulation environment, we perform a thorough analysis of the people-monitoring application's performance under different conditions. Our experiments demonstrate the benefits of collaborative mobile cameras over static or individual camera setups.
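The abstract describes a closed loop between a distributed multi-camera tracker and a viewpoint controller for the mobile cameras. As a hedged illustration only (not the authors' implementation), the Python sketch below shows one possible instance of that loop: a constant-velocity Kalman filter fuses static-camera detections of one person into a global track, and a toy scoring function selects the mobile-camera pose expected to be most useful for an attribute classifier. All names (KalmanTrack, best_viewpoint), the scoring heuristic, and every parameter value are hypothetical assumptions, not taken from the paper.

import numpy as np

class KalmanTrack:
    """Constant-velocity Kalman filter for one person on the ground plane."""
    def __init__(self, xy, dt=0.1):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])       # [px, py, vx, vy]
        self.P = np.eye(4)                                 # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt   # motion model
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = 0.01 * np.eye(4)                          # process noise
        self.R = 0.05 * np.eye(2)                          # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        # Standard Kalman correction with a 2-D position detection.
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

def best_viewpoint(track, candidates, classifier_confidence):
    """Pick the candidate mobile-camera pose (xy, heading) whose view is
    expected to help the attribute classifier most; this toy score simply
    prefers close, frontal views when the classifier is still uncertain."""
    person = track.x[:2]
    def score(pose):
        cam_xy, cam_heading = pose
        to_person = person - np.asarray(cam_xy)
        dist = np.linalg.norm(to_person)
        facing = np.cos(np.arctan2(to_person[1], to_person[0]) - cam_heading)
        return facing / (1.0 + dist) * (1.0 - classifier_confidence)
    return max(candidates, key=score)

# One step of the cooperation loop: static cameras feed the tracker,
# the tracker feeds the controller, the controller repositions the robot.
track = KalmanTrack(xy=(2.0, 3.0))
for z in [np.array([2.1, 3.2]), np.array([2.3, 3.3])]:   # static-camera detections
    track.predict()
    track.update(z)

candidates = [((0.0, 0.0), 0.9), ((4.0, 4.0), -2.3), ((1.0, 5.0), -1.2)]
goal = best_viewpoint(track, candidates, classifier_confidence=0.4)
print("track estimate:", track.x[:2], "-> move mobile camera to:", goal[0])

The sketch mirrors the division of labor stated in the abstract: the static network supplies the global position estimates (the Kalman track), while the controller uses them to place the mobile camera where the classification task benefits most. The actual paper's tracking, control, and classification components are more sophisticated; see the publication at the DOI above for the authors' method.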