Simultaneous drone localization and communication using visible light

Abstract

Drones that perform complex autonomous movements require an accurate estimate of their current position. However, inertial measurement unit (IMU) errors introduce drift in this estimate, leading to significant discrepancies between the predicted and actual location. Various solutions have been proposed to calibrate the IMU, including methods involving cameras and humans in the loop. This thesis suggests implementing a previously developed technique that projects a precise static light polarisation grid into a room. Although this pattern is invisible to the human eye, it can be observed using a polariser and colour sensor combination. A drone equipped with such a sensor setup can correct for IMU drift by utilising the perceived polarisation patterns as optical landmarks.
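To make the landmark idea concrete, the sketch below shows one minimal way a decoded polarisation signature could be used to rein in a dead-reckoned IMU estimate. The grid layout, signature encoding, function names and blending weight are all illustrative assumptions, not the estimator actually used in the thesis.

```python
import numpy as np

# Hypothetical map: each polarisation signature observed through the
# polariser/colour sensor pair corresponds to the known 2D centre (in metres)
# of one cell of the static projected grid.
GRID_LANDMARKS = {
    (0, 1, 2): np.array([0.5, 0.5]),
    (2, 0, 1): np.array([0.5, 1.5]),
    # ... one entry per distinguishable cell of the polarisation pattern
}

def correct_drift(dead_reckoned_pos, observed_code, blend=0.8):
    """Pull the drifting IMU estimate towards the optical landmark.

    `observed_code` is the classified polarisation signature under the sensor;
    `blend` weights the landmark fix against the dead-reckoned position.
    """
    landmark = GRID_LANDMARKS.get(tuple(observed_code))
    if landmark is None:               # signature not recognised: keep IMU estimate
        return dead_reckoned_pos
    return blend * landmark + (1.0 - blend) * np.asarray(dead_reckoned_pos)
```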

The system design is further developed by exploring the potential of visible light communication (VLC) as an alternative to traditional radio frequency (RF) links for drone control. By leveraging the existing infrastructure used to project the polarisation grid, a VLC link is integrated into the system. With this addition, the work strives to fuse polarisation-based localisation and VLC, taking the first steps towards a fully visible light-based drone platform.
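As a rough illustration of the receiver side of such a link, the snippet below decodes on-off-keyed light-intensity samples into bits. The modulation, symbol rate and thresholding are assumptions for the example; the thesis's actual VLC scheme and framing may differ.

```python
import numpy as np

def decode_ook(samples, samples_per_symbol, threshold=None):
    """Illustrative on-off-keying decoder for a VLC downlink.

    `samples` are light-intensity readings from the drone's receiver,
    sampled at `samples_per_symbol` readings per transmitted bit.
    """
    samples = np.asarray(samples, dtype=float)
    if threshold is None:
        threshold = samples.mean()                 # crude adaptive threshold
    n_symbols = len(samples) // samples_per_symbol
    symbols = samples[:n_symbols * samples_per_symbol].reshape(n_symbols, -1)
    # A symbol is a 1 if its average intensity exceeds the threshold.
    return (symbols.mean(axis=1) > threshold).astype(int)
```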

To validate the system, a prototype is created that achieves real-time simultaneous localisation and communication on an embedded drone. This is accomplished through machine-learning-based classification, a drone motion model, an optimised polarisation pattern that enables fast localisation, and a noise-resistant VLC link. Experiments show a median 2D tracking error of 10 cm using only light-based methods, and a VLC link range of up to 2.5 metres under various conditions.
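The interplay of the motion model and the light-based fixes can be pictured as a predict/update loop; the sketch below uses a constant-velocity prediction and a fixed-gain correction purely as a stand-in for whatever estimator the prototype implements.

```python
import numpy as np

def predict(pos, vel, dt):
    """Constant-velocity motion-model prediction step."""
    return pos + vel * dt

def update(pred_pos, light_fix, gain=0.5):
    """Blend the prediction with a polarisation-based position fix
    (fixed gain here; the thesis's estimator may weight fixes differently)."""
    return pred_pos + gain * (light_fix - pred_pos)

# Example: one iteration of a 10 Hz loop fusing dead reckoning with an optical fix.
pos, vel, dt = np.zeros(2), np.array([0.2, 0.0]), 0.1
pos = predict(pos, vel, dt)
pos = update(pos, np.array([0.03, 0.0]))
```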