Real-time optical flow estimation on a nano quadcopter

Nano quadcopters are small, agile, and cheap platforms well suited for deployment in narrow, cluttered environments. Due to their limited payload, nano quadcopters are highly constrained in processing power, making conventional vision-based methods for autonomous navigation infeasible. Recent machine learning developments promise high-performance perception at low latency, while novel ultra-low-power microcontrollers augment the visual processing power of nano quadcopters. In this work, we present NanoFlowNet, an optical flow CNN that, based on the semantic segmentation architecture STDC-Seg, achieves real-time dense optical flow estimation on edge hardware. We use motion boundary ground truth to guide the learning of optical flow, improving performance at no cost in inference latency. Validation on MPI-Sintel shows that the proposed method performs well given its constrained architecture. We implement the CNN on the ultra-low-power GAP8 microcontroller and demonstrate it in an obstacle avoidance application on a 34 g Bitcraze Crazyflie nano quadcopter.
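The abstract describes a training objective in which a motion-boundary target guides optical flow learning without adding inference cost, since the auxiliary head is used only at training time. The snippet below is a minimal NumPy sketch of such a joint loss, not the paper's implementation: the function names (`epe_loss`, `boundary_bce_loss`, `training_loss`) and the weighting parameter `lam` are illustrative assumptions.

```python
import numpy as np

def epe_loss(flow_pred, flow_gt):
    # Average endpoint error: mean L2 distance between predicted
    # and ground-truth flow vectors (last axis holds (u, v)).
    return float(np.mean(np.linalg.norm(flow_pred - flow_gt, axis=-1)))

def boundary_bce_loss(boundary_logits, boundary_gt):
    # Binary cross-entropy on the auxiliary motion-boundary map
    # (hypothetical guidance head, trained against boundary ground truth).
    p = 1.0 / (1.0 + np.exp(-boundary_logits))
    eps = 1e-7
    return float(-np.mean(boundary_gt * np.log(p + eps)
                          + (1.0 - boundary_gt) * np.log(1.0 - p + eps)))

def training_loss(flow_pred, flow_gt, boundary_logits, boundary_gt, lam=1.0):
    # Joint training objective: flow error plus a weighted boundary
    # term. The boundary head is discarded at inference, so the
    # deployed network's latency is unchanged.
    return epe_loss(flow_pred, flow_gt) + lam * boundary_bce_loss(
        boundary_logits, boundary_gt)
```

At deployment, only the flow branch is evaluated; the boundary branch exists solely to shape the shared features during training.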