ESVO: Event-based Stereo Visual Odometry

Abstract. Event-based cameras are bio-inspired vision sensors whose pixels work independently from each other and respond asynchronously to brightness changes, with microsecond resolution. Their advantages make it possible to tackle challenging scenarios in robotics, such as high-speed and high dynamic range scenes. We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig. Our system follows a parallel tracking-and-mapping approach, where novel solutions to each subproblem (3D reconstruction and camera pose estimation) are developed with two objectives in mind: being principled and efficient, for real-time operation with commodity hardware. To this end, we seek to maximize the spatio-temporal consistency of stereo event-based data while using a simple and efficient representation. Specifically, the mapping module builds a semi-dense 3D map of the scene by fusing depth estimates from multiple local viewpoints (obtained by spatio-temporal consistency) in a probabilistic fashion. The tracking module recovers the pose of the stereo rig by solving a registration problem that naturally arises due to the chosen map and event data representation. Experiments on publicly available datasets and on our own recordings demonstrate the versatility of the proposed method in natural scenes with general 6-DoF motion. The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low-light and high dynamic range, while running in real-time on a standard CPU. We release the software and dataset under an open source license to foster research in the emerging topic of event-based SLAM.


New results on DSEC Dataset (27 May 2021)


Yi Zhou, Guillermo Gallego, Shaojie Shen, "Event-based Stereo Visual Odometry," IEEE Transactions on Robotics, vol. 37, no. 5, pp. 1433-1450, Oct. 2021.

@article{Zhou21tro, author={Zhou, Yi and Gallego, Guillermo and Shen, Shaojie}, title={Event-Based Stereo Visual Odometry}, journal={IEEE Transactions on Robotics}, year={2021}, volume={37}, number={5}, pages={1433-1450}, doi={10.1109/TRO.2021.3062252}}

Yi Zhou, Guillermo Gallego, Henri Rebecq, Laurent Kneip, Hongdong Li, Davide Scaramuzza, "Semi-dense 3D reconstruction with a stereo event camera." In Proc. European Conference on Computer Vision (ECCV), pp. 235-251. 2018.

@inproceedings{Zhou18eccv, title={Semi-dense 3D reconstruction with a stereo event camera}, author={Zhou, Yi and Gallego, Guillermo and Rebecq, Henri and Kneip, Laurent and Li, Hongdong and Scaramuzza, Davide}, booktitle={European Conference on Computer Vision (ECCV)}, pages={235-251}, year={2018}}

Source Code

Together with the code, we have also released the CAD design of the stereo rig in the GitHub repository.

Dataset Download

The original rpg (University of Zurich) and upenn (University of Pennsylvania) datasets can be downloaded from

For convenience, we provide the edited rosbag files used in the paper. The editing procedure is described in the "readme" file under /events_repacking_helper.

edited rpg stereo dataset

edited upenn stereo dataset

hkust stereo dataset


The software is released under a GPLv3 license. For commercial use, please contact the authors.

Additional Resources


  • 2021/2/23 ESVO is integrated into the modular iniVation DV software platform. Download here:

  • 2021/2/16 ESVO accepted by IEEE Transactions on Robotics (T-RO).


If you have any questions about this project, please contact

Yi Zhou

Guillermo Gallego

Shaojie Shen