Neuromorphic Computing of Event-Based Data for High-Speed Vision-Based Navigation

Abstract

The combination of Spiking Neural Networks and event-based vision sensors holds the potential for highly efficient, high-bandwidth optical flow estimation. This thesis presents, to the best of the author’s knowledge, the first hierarchical spiking architecture in which motion (direction and speed) selectivity emerges in a biologically plausible, unsupervised fashion from stimuli generated by an event-based camera. A novel adaptive neuron model and Spike-Timing-Dependent Plasticity formulation are at the core of this neural network, governing its spike-based processing and learning, respectively. After convergence, the architecture exhibits the main properties of biological visual motion systems: feature extraction and local and global motion perception. To assess the learning outcome, a shallow conventional Artificial Neural Network is trained to map the activation traces of the penultimate layer to the optical flow visual observables of ventral flow. The proposed solution is validated on simulated event sequences with ground-truth measurements. Experimental results show that accurate estimates of these parameters can be obtained over a wide range of speeds.
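The abstract does not reproduce the thesis's adaptive neuron model or STDP formulation. As a rough illustration of the class of mechanisms involved, the sketch below implements a generic leaky integrate-and-fire neuron with an adaptive (homeostatic) threshold and a pairwise, trace-based STDP rule driven by Poisson-like event spikes. All time constants, amplitudes, and the specific update rules here are illustrative assumptions, not the formulations proposed in the thesis.

```python
import numpy as np

# Illustrative sketch only: a generic LIF neuron with an adaptive threshold
# and trace-based STDP, standing in for the thesis's (unspecified here)
# adaptive neuron model and STDP formulation. All constants are assumed.

rng = np.random.default_rng(0)

n_in, n_steps = 16, 200
dt = 1e-3                                      # simulation step (s)
tau_v, tau_th, tau_tr = 20e-3, 100e-3, 20e-3   # membrane, threshold, trace taus
th_rest, th_add = 1.0, 0.2                     # resting threshold, post-spike bump
a_plus, a_minus = 0.01, 0.012                  # potentiation / depression amplitudes

w = rng.uniform(0.0, 0.5, n_in)                # input weights (learned)
v, theta = 0.0, th_rest                        # membrane potential, adaptive threshold
x_pre = np.zeros(n_in)                         # presynaptic eligibility traces

for _ in range(n_steps):
    spikes_in = rng.random(n_in) < 0.05        # Poisson-like input events
    x_pre += -x_pre * dt / tau_tr + spikes_in  # trace decay plus spike bump
    v += -v * dt / tau_v + w @ spikes_in       # leaky integration of input current
    theta += (th_rest - theta) * dt / tau_th   # threshold relaxes toward rest
    if v >= theta:                             # postsynaptic spike fired
        v = 0.0
        theta += th_add                        # homeostatic threshold adaptation
        w += a_plus * x_pre                    # potentiate recently active inputs
        w -= a_minus * (1.0 - x_pre)           # depress inactive inputs
        w = np.clip(w, 0.0, 1.0)               # keep weights bounded

print(w.round(3))
```

In such schemes, the combination of an adaptive threshold and bounded STDP lets selectivity to recurring spatiotemporal input patterns emerge without any supervision signal, which is the kind of behaviour the thesis exploits for motion selectivity.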