Behavior Trees for Evolutionary Robotics
Kirk Scheper (TU Delft - Control & Simulation)
S. Tijmons (TU Delft - Control & Simulation)
C. C. de Visser (TU Delft - Control & Simulation)
G. C. H. E. de Croon (TU Delft - Control & Simulation)
Abstract
Evolutionary Robotics allows robots with limited sensors and processing to tackle complex tasks by means of sensory-motor coordination. In this article we show the first application of the Behavior Tree framework to a real robotic platform using the evolutionary robotics methodology. This framework is used to improve the intelligibility of the emergent robotic behavior over that of the traditional neural network formulation. As a result, the behavior is easier to comprehend and to adapt manually when crossing the reality gap from simulation to reality. This functionality is demonstrated in real-world flight tests with the 20-g DelFly Explorer flapping-wing micro air vehicle equipped with a 4-g onboard stereo vision system. The experiments show that the DelFly can fully autonomously search for and fly through a window using only its onboard sensors and processing. The success rate of the optimized behavior in simulation is 88%, and the corresponding real-world performance is 54% after user adaptation. Although this leaves room for improvement, it is higher than the 46% success rate of a tuned user-defined controller.
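To illustrate the intelligibility argument, the sketch below shows the general Behavior Tree concept under loose assumptions: the Sequence, Selector, Condition, and Action node types, the tick() interface, the blackboard keys, and the window-search logic are hypothetical and are not taken from the paper's DelFly implementation. In the article, the structure and parameters of such a tree are what the evolutionary algorithm optimizes; here the tree is simply hand-written to show why the resulting behavior is easy to read and to adjust by hand.

# Minimal sketch of a Behavior Tree, assuming a simple tick() interface.
# All node names, blackboard keys, and the window-search logic below are
# illustrative assumptions, not the authors' DelFly control code.

SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"  # RUNNING kept for completeness


class Sequence:
    """Composite node: ticks children in order; fails as soon as one fails."""
    def __init__(self, children):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != SUCCESS:
                return status
        return SUCCESS


class Selector:
    """Composite node: ticks children in order; succeeds as soon as one succeeds."""
    def __init__(self, children):
        self.children = children

    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != FAILURE:
                return status
        return FAILURE


class Condition:
    """Leaf node that checks a predicate on the shared blackboard."""
    def __init__(self, predicate):
        self.predicate = predicate

    def tick(self, blackboard):
        return SUCCESS if self.predicate(blackboard) else FAILURE


class Action:
    """Leaf node that writes an actuator command to the blackboard."""
    def __init__(self, effect):
        self.effect = effect

    def tick(self, blackboard):
        self.effect(blackboard)
        return SUCCESS


# Hypothetical window-search behavior: steer toward the window if the stereo
# vision system reports one, otherwise keep turning to search for it.
tree = Selector([
    Sequence([
        Condition(lambda bb: bb.get("window_visible", False)),
        Action(lambda bb: bb.update(rudder=bb.get("window_bearing", 0.0))),
    ]),
    Action(lambda bb: bb.update(rudder=0.5)),  # fallback: circle and keep searching
])

blackboard = {"window_visible": True, "window_bearing": -0.2}
tree.tick(blackboard)          # one control cycle
print(blackboard["rudder"])    # -> -0.2, i.e. steer toward the detected window

Because each node is a named condition or action rather than a weight in a network, a designer can inspect the tree, see which branch fired, and tweak a single leaf when transferring the behavior from simulation to the real vehicle.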