Behavior Trees for Evolutionary Robotics

Abstract

Evolutionary Robotics allows robots with limited sensors and processing to tackle complex tasks by means of sensory-motor coordination. In this article we show the first application of the Behavior Tree framework to a real robotic platform using the evolutionary robotics methodology. This framework is used to improve the intelligibility of the emergent robotic behavior over that of the traditional neural network formulation. As a result, the behavior is easier to comprehend and manually adapt when crossing the reality gap from simulation to reality. This capability is demonstrated through real-world flight tests with the 20-g DelFly Explorer flapping wing micro air vehicle equipped with a 4-g onboard stereo vision system. The experiments show that the DelFly can fully autonomously search for and fly through a window with only its onboard sensors and processing. The success rate of the optimized behavior in simulation is 88%, and the corresponding real-world performance is 54% after user adaptation. Although this leaves room for improvement, it is higher than the 46% success rate of a tuned user-defined controller.
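To give a sense of the Behavior Tree formulation referred to above, the following is a minimal illustrative sketch in Python of sequence and fallback composites with condition and action leaves, applied to a simplified window-search behavior. The node structure, condition, and command names here are hypothetical illustrations, not the evolved tree or the DelFly control interface from the article.

```python
# Minimal Behavior Tree sketch: composites tick their children in order and
# propagate SUCCESS/FAILURE/RUNNING. The example tree below is hypothetical.

SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

class Sequence:
    """Succeeds only if all children succeed; stops at the first non-success."""
    def __init__(self, children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != SUCCESS:
                return status
        return SUCCESS

class Fallback:
    """Succeeds as soon as one child succeeds; fails only if all fail."""
    def __init__(self, children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != FAILURE:
                return status
        return FAILURE

class Condition:
    """Leaf node wrapping a boolean test on the shared blackboard."""
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self, blackboard):
        return SUCCESS if self.predicate(blackboard) else FAILURE

class Action:
    """Leaf node that applies an effect to the blackboard and succeeds."""
    def __init__(self, effect):
        self.effect = effect
    def tick(self, blackboard):
        self.effect(blackboard)
        return SUCCESS

# Hypothetical window-search behavior: if a window is detected, fly toward it;
# otherwise keep searching by turning.
tree = Fallback([
    Sequence([
        Condition(lambda bb: bb.get("window_detected", False)),
        Action(lambda bb: bb.update(command="fly_to_window")),
    ]),
    Action(lambda bb: bb.update(command="turn_and_search")),
])

blackboard = {"window_detected": True}
tree.tick(blackboard)
print(blackboard["command"])  # -> fly_to_window
```

Because such a tree is an explicit, hierarchical policy rather than a weight matrix, an operator can read and hand-tune individual branches, which is the intelligibility advantage the abstract claims over the neural network formulation.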