Passive Localization of Robots with Ambient Light

Abstract

A lot of research is being done on Visible Light Communication (VLC), which has been shown to be of interest for many applications, such as localization. Since localization based on VLC requires active modulation of light sources, the number of light sources that can be used for localization is limited. Furthermore, in some situations there might not even be a controllable light source present (for example outdoors). To extend the use of light-based localization schemes, this thesis looks into a way to achieve the same result as current VLC localization methods in a passive manner, i.e. without control of the light sources.

Previous work on passive ambient light-based localization has been done by Wang et al.: objects are equipped with unique barcodes that reflect ambient light in a distinct manner. The reflected light is received by photosensors, from which the objects' IDs are obtained. However, this work has focused on identifying large-sized objects in one dimension. Applying the same principle to the localization of small-sized objects, and in two dimensions, are open challenges that this thesis addresses.

The work presented here forms a proof-of-concept of a passive light-based localization system for two-dimensional, real-time tracking of small-sized objects. To achieve this, a special enclosure has been designed, giving simple photosensors the ability to distinguish small-sized objects without compromising their field of view (FOV). With this enclosure, a single photosensor can detect barcodes down to 7 cm in size in the test set-up, while distinguishing up to three different IDs. A particle filter has been implemented to combine detections from different photosensors into a single estimate of an object's location.
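To illustrate how a particle filter can fuse per-sensor detections into one position estimate, the sketch below shows a minimal predict/update/resample cycle. It is not the thesis's implementation: the function name, noise scales, and Gaussian detection model are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, detections,
                         sensor_noise=4.8, motion_noise=2.0):
    """One cycle of a 2D particle filter (illustrative sketch).

    particles:  (N, 2) array of candidate object positions in cm.
    detections: list of (x, y) position estimates from individual photosensors.
    Noise scales are placeholder values, not tuned thesis parameters.
    """
    # Predict: diffuse particles to account for (unknown) robot motion.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)

    # Update: weight each particle by its likelihood under every detection,
    # assuming Gaussian sensor noise around the true position.
    for det in detections:
        d2 = np.sum((particles - np.asarray(det)) ** 2, axis=1)
        weights = weights * np.exp(-d2 / (2.0 * sensor_noise ** 2))
    weights = weights / weights.sum()

    # Resample: redraw particles proportionally to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))

    # Estimate: mean of the resampled particle cloud.
    estimate = particles.mean(axis=0)
    return particles, weights, estimate
```

In use, the particles would be initialized uniformly over the tracking area and the function called once per batch of photosensor detections; the running estimate then converges toward the region where the detections agree.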

The localization system is built around robots developed by an MSc student in the Embedded Systems group at TU Delft. Moving these robots in a straight line through the test set-up at a speed of 15.4 cm/s yields a localization error of 4.8 cm, at a distance of 20 cm between the robots and the sensor.