NeuSpike-Net: High Speed Video Reconstruction via Bio-inspired

Neuromorphic Cameras

Lin Zhu, Jianing Li, Xiao Wang, Tiejun Huang, Yonghong Tian*

[paper] [data] [video]

Abstract

Neuromorphic vision sensors are a new bio-inspired imaging paradigm that has emerged in recent years; they continuously sense luminance intensity and fire asynchronous spikes (events) with high temporal resolution. Typically, there are two types of neuromorphic vision sensors: the dynamic vision sensor (DVS) and the spike camera. From the perspective of bio-inspired sampling, the DVS perceives only movement by imitating the retinal periphery, while the spike camera was developed to perceive fine textures by simulating the fovea. It is therefore meaningful to explore how to combine the two types of neuromorphic cameras to reconstruct high-quality images, as human vision does. In this paper, we propose NeuSpike-Net, which learns both the high dynamic range and high motion sensitivity of the DVS and the full texture sampling of the spike camera to achieve high-speed and high-dynamic-range image reconstruction. We propose a novel representation to effectively extract the temporal information from spike and event data. By introducing a feature fusion module, the two types of neuromorphic data complement each other. Experimental results on simulated and real datasets demonstrate that the proposed approach effectively reconstructs high-speed and high-dynamic-range images by combining spike and event data.

The motivation of our approach: neuromorphic cameras are inspired by the retina. The DVS perceives movement by imitating the retinal periphery, while the spike camera was developed to sense fine textures by simulating the fovea. In this work, we combine spike and event data so that they complement each other, yielding better reconstruction quality.
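To make the two data modalities concrete, the sketch below shows common baseline representations for the two streams: binning asynchronous DVS events into a temporal voxel grid, and estimating per-pixel texture from a spike-camera stream via its firing rate. The function names and the exact binning scheme are illustrative assumptions, not the paper's learned representation.

```python
import numpy as np

def event_voxel_grid(events, num_bins, height, width):
    """Bin asynchronous events into a (num_bins, H, W) voxel grid.

    `events` is an (N, 4) array of (t, x, y, polarity), polarity in {+1, -1}.
    This is a common DVS representation, not necessarily NeuSpike-Net's.
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    # Normalize timestamps into [0, num_bins) so each event maps to a bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1e-6)
    bins = t_norm.astype(int)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    # Accumulate signed polarities; np.add.at handles repeated indices.
    np.add.at(grid, (bins, y, x), events[:, 3])
    return grid

def spike_interval_texture(spikes):
    """Estimate per-pixel intensity from a binary spike stream (T, H, W).

    Brighter pixels fire more often, so the firing rate (equivalently, the
    inverse of the mean inter-spike interval) is a simple texture proxy.
    """
    return spikes.mean(axis=0)  # firing rate in [0, 1]
```

For example, a pixel that spikes at every time step maps to intensity 1.0, while one that spikes every fifth step maps to 0.2, recovering relative brightness from the spike stream alone.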


Experimental results


If you find this work helpful for your research, please cite:

@inproceedings{-,
  title={NeuSpike-Net: High Speed Video Reconstruction via Bio-inspired Neuromorphic Cameras},
  author={Zhu, Lin and Li, Jianing and Wang, Xiao and Huang, Tiejun and Tian, Yonghong},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2021}
}