Optical Flow Estimation Using Event-Based Cameras

Improving Optical Flow Estimation Accuracy Using Space-Aware De-Flickering


Abstract

Event cameras are novel sensors whose high temporal resolution and bandwidth motivate their use for optical flow estimation. However, the same properties make event cameras vulnerable to flickering light sources: flicker obscures motion by flooding the event stream with unrelated events. The only existing event de-flickering method (EFR) is built for scenarios where the relative position of the camera and the flickering object is constant, which is uncommon in motion-heavy optical flow estimation scenarios. Our contribution is a new de-flickering method that incorporates spatial awareness of nearby pixels. We hypothesize that this spatial awareness increases robustness to motion and thereby improves optical flow accuracy. Compared to EFR, our method is weaker at suppressing intensely flickering surfaces but better preserves the spatial coherence of edges. However, we observe that both de-flickering methods remove much geometric information, especially under slow motion or weak ambient illumination. Our benchmarking shows that neither our method nor EFR significantly affects optical flow estimation accuracy, despite reducing event counts by 50-65%. Overall, we conclude that the niche benefits of spatial filtering are nullified by the finding that de-flickering hardly affects optical flow estimation accuracy.
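To illustrate the idea of spatially aware de-flickering, the sketch below shows one possible (hypothetical) filter, not the paper's actual method: an event is flagged as flicker when its pixel's previous event had opposite polarity and arrived roughly one flicker period earlier, but a flagged event is kept anyway if enough neighboring pixels were also recently active, since spatially coherent activity suggests a real moving edge. All names, parameters, and thresholds here are illustrative assumptions.

```python
import numpy as np

def spatial_deflicker(events, width, height,
                      flicker_period=0.01, tol=0.25,
                      radius=1, min_support=2):
    """Illustrative sketch of a spatially aware event de-flicker filter.

    events: iterable of (x, y, t, p) tuples sorted by timestamp t,
    with polarity p in {-1, +1}. An event is flagged as flicker when
    its pixel's previous event had opposite polarity and arrived
    ~flicker_period earlier. A flagged event is kept only if at least
    min_support other pixels within `radius` fired within the last
    flicker_period (spatial coherence hints at a real moving edge
    rather than an isolated flickering light source).
    """
    last_t = np.full((height, width), -np.inf)     # last event time per pixel
    last_p = np.zeros((height, width), dtype=int)  # last polarity per pixel
    kept = []
    for x, y, t, p in events:
        dt = t - last_t[y, x]
        is_flicker = (p == -last_p[y, x]
                      and abs(dt - flicker_period) < tol * flicker_period)
        if is_flicker:
            # Count recently active pixels in the neighborhood,
            # excluding the current pixel itself.
            y0, y1 = max(0, y - radius), min(height, y + radius + 1)
            x0, x1 = max(0, x - radius), min(width, x + radius + 1)
            patch_age = t - last_t[y0:y1, x0:x1]
            support = (int(np.sum(patch_age < flicker_period))
                       - int(dt < flicker_period))
            if support >= min_support:
                kept.append((x, y, t, p))  # flicker-like but spatially supported
        else:
            kept.append((x, y, t, p))
        last_t[y, x] = t
        last_p[y, x] = p
    return kept
```

For example, an isolated pixel emitting alternating-polarity events every 10 ms is filtered down to its first event, while a line of distinct pixels firing once in quick succession (a moving edge) passes through untouched, which is the behavior a spatially aware filter is meant to trade for weaker suppression of large flickering surfaces.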