OPAL: A Stereo Vision Obstacle Processing ALgorithm for a Walking Lunar Rover

Abstract

The Lunar Zebro is a small six-legged robot with the potential to operate in a swarm carrying out objectives such as exploring planetary surfaces. A new step towards autonomous navigation is taken with the newly developed Obstacle Processing ALgorithm (OPAL), built primarily on open-source libraries. This study shows that the initial iteration of OPAL can detect rocks and determine their position relative to the rover's low-mounted cameras using a stereo vision system. Obstacles and their relative distances are detected from the disparity map, the amount by which pixels shift between the images of the stereo pair. By translating the disparity map into a V-disparity map, a histogram of disparity values per image row, the ground and the obstacles can be isolated. Six steps were taken to realise this thesis goal. After setting the requirements, a test model called Bars was developed and tested at a location containing a Mars-like environment (Decos). This test model combines Lunar Zebro's hardware with mostly Commercial Off-The-Shelf products. With the recorded footage, all components of OPAL were integrated into a single algorithm. A pipeline was then set up on a server, and multiple test cases were run to establish results. The predetermined requirements of the algorithm were validated against tape-measure measurements and ground-truth bounding boxes tracked by a CSRT tracker. Together, Bars and the initial iteration of OPAL prove feasibility and expose opportunities and challenges that could serve as a starting point for optimisations or alternative approaches.
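The V-disparity construction mentioned in the abstract (a histogram of disparity values per image row, in which a flat ground plane appears as a slanted line and an obstacle as a near-vertical line) can be sketched as follows. This is a minimal NumPy illustration of the general technique, not OPAL's actual implementation; the function name and the synthetic disparity values are illustrative assumptions.

```python
import numpy as np

def v_disparity(disparity, max_disp):
    """Build a V-disparity image: one histogram of disparity values per row.

    disparity: (H, W) integer disparity map; negative values mark invalid pixels.
    Returns an (H, max_disp) array where entry [v, d] counts how many pixels
    in image row v have disparity d.
    """
    h, _ = disparity.shape
    vdisp = np.zeros((h, max_disp), dtype=np.int32)
    for v in range(h):
        row = disparity[v]
        valid = row[(row >= 0) & (row < max_disp)]  # drop invalid/out-of-range
        vdisp[v] = np.bincount(valid.astype(np.int32), minlength=max_disp)
    return vdisp

# Synthetic scene (illustrative): a flat ground plane gives disparity that
# grows linearly towards the bottom of the image, so it maps to a slanted
# line in V-disparity; a rock at roughly constant depth gives constant
# disparity over many rows, i.e. a vertical line segment.
h, w, max_disp = 100, 120, 64
rows = np.arange(h).reshape(-1, 1)
disp = (rows * 0.5).astype(np.int32) * np.ones((1, w), dtype=np.int32)
disp[30:60, 40:80] = 25  # hypothetical rock patch at constant disparity 25
vd = v_disparity(disp, max_disp)
```

In `vd`, the ground contributes a diagonal ridge (row `v` peaks at disparity `v // 2` here), while the rock contributes a vertical segment at disparity 25 spanning rows 30 to 59; separating these two patterns is what lets the ground and obstacles be isolated.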