Visual route following for tiny autonomous robots

Journal Article (2024)
Author(s)

Tom van Dijk (TU Delft - Control & Simulation)

C. de Wagter (TU Delft - Control & Simulation)

Guido C.H.E. de Croon (TU Delft - Control & Simulation)

Research Group
Control & Simulation
DOI
https://doi.org/10.1126/scirobotics.adk0310
Publication Year
2024
Language
English
Issue number
92
Volume number
9
Pages (from-to)
eadk0310
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Navigation is an essential capability for autonomous robots. In particular, visual navigation has been a major research topic in robotics because cameras are lightweight, power-efficient sensors that provide rich information on the environment. However, the main challenge of visual navigation is that it requires substantial computational power and memory for visual processing and storage of the results. As of yet, this has precluded its use on small, extremely resource-constrained robots such as lightweight drones. Inspired by the parsimony of natural intelligence, we propose an insect-inspired approach toward visual navigation that is specifically aimed at extremely resource-restricted robots. It is a route-following approach in which a robot's outbound trajectory is stored as a collection of highly compressed panoramic images together with their spatial relationships as measured with odometry. During the inbound journey, the robot uses a combination of odometry and visual homing to return to the stored locations, with visual homing preventing the buildup of odometric drift. A main advancement of the proposed strategy is that the number of stored compressed images is minimized by spacing them apart as far as the accuracy of odometry allows. To demonstrate the suitability for small systems, we implemented the strategy on a tiny 56-gram drone. The drone could successfully follow routes up to 100 meters with a trajectory representation that consumed less than 20 bytes per meter. The presented method forms a substantial step toward the autonomous visual navigation of tiny robots, facilitating their more widespread application.
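The core idea in the abstract, storing a new compressed snapshot only when odometric drift since the last one could grow too large, so that waypoints are spaced as far apart as odometry accuracy allows, can be illustrated with a small sketch. All names and numbers below (`Route`, `DRIFT_PER_M`, `MAX_DRIFT_M`) are illustrative assumptions, not values or code from the paper.

```python
# Hypothetical sketch of the outbound route-recording idea: a waypoint
# (odometry position plus a compressed panoramic snapshot) is stored only
# when the estimated odometric drift accumulated since the last waypoint
# reaches a bound. Constants are assumed for illustration.

DRIFT_PER_M = 0.05   # assumed odometry drift: 5% of distance travelled
MAX_DRIFT_M = 0.5    # store a new snapshot before drift can exceed 0.5 m


class Route:
    def __init__(self):
        self.waypoints = []   # (odometry position, compressed snapshot)
        self._dist = 0.0      # metres travelled since the last snapshot

    def record_step(self, position, snapshot, step_len):
        """Call once per odometry update on the outbound trajectory."""
        self._dist += step_len
        # Store a waypoint at the start, or once the drift estimate
        # since the last waypoint reaches the allowed bound.
        if not self.waypoints or self._dist * DRIFT_PER_M >= MAX_DRIFT_M:
            self.waypoints.append((position, snapshot))
            self._dist = 0.0


# Demo: a 100 m straight outbound leg sampled every metre.
route = Route()
for i in range(1, 101):
    route.record_step(position=i, snapshot=f"img{i}", step_len=1.0)

print(len(route.waypoints))  # → 10, i.e. waypoints spaced ~10 m apart
```

With these assumed constants the drone would keep roughly one waypoint every 10 m; on the inbound journey it would fly between waypoints on odometry and use visual homing against each stored snapshot to cancel the accumulated drift, as the abstract describes.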

Files

Scirobotics.adk0310.pdf
(pdf | 1.37 MB)
Embargo expired on 23-12-2024
License info not available