ESVO: Event-based Stereo Visual Odometry



Abstract

Event-based cameras are bio-inspired vision sensors whose pixels work independently from each other and respond asynchronously to brightness changes, with microsecond resolution. Their advantages make it possible to tackle challenging scenarios in robotics, such as high-speed and high dynamic range scenes. We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig. Our system follows a parallel tracking-and-mapping approach, where novel solutions to each subproblem (3D reconstruction and camera pose estimation) are developed with two objectives in mind: being principled and efficient, for real-time operation with commodity hardware. To this end, we seek to maximize the spatio-temporal consistency of stereo event-based data while using a simple and efficient representation. Specifically, the mapping module builds a semi-dense 3D map of the scene by fusing depth estimates from multiple local viewpoints (obtained by spatio-temporal consistency) in a probabilistic fashion. The tracking module recovers the pose of the stereo rig by solving a registration problem that naturally arises due to the chosen map and event data representation. Experiments on publicly available datasets and on our own recordings demonstrate the versatility of the proposed method in natural scenes with general 6-DoF motion. The system successfully leverages the advantages of event-based cameras to perform visual odometry in challenging illumination conditions, such as low-light and high dynamic range, while running in real-time on a standard CPU. We release the software and dataset under an open source license to foster research in the emerging topic of event-based SLAM.
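To make the probabilistic fusion idea concrete, below is a minimal illustrative sketch (not the authors' implementation): each per-pixel inverse-depth estimate is modeled as a Gaussian, estimates from different local viewpoints are gated for compatibility, and compatible ones are fused via the standard product of Gaussians. The function names, gating threshold, and numbers are assumptions for illustration only.

```python
# Illustrative sketch of probabilistic inverse-depth fusion
# (a simplified stand-in for the paper's mapping module, not its actual code).

def fuse(mu_a, var_a, mu_b, var_b):
    """Fuse two Gaussian inverse-depth estimates; returns (mean, variance)."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)      # precisions add
    mu = var * (mu_a / var_a + mu_b / var_b)     # precision-weighted mean
    return mu, var

def compatible(mu_a, var_a, mu_b, var_b, thresh=2.0):
    """Chi-square-style gate: fuse only estimates that agree statistically."""
    d2 = (mu_a - mu_b) ** 2 / (var_a + var_b)
    return d2 < thresh ** 2

# Example: a prior inverse-depth estimate refined by two noisy observations
# from nearby viewpoints (all values hypothetical, inverse depth in 1/m).
mu, var = 0.50, 0.04
observations = [(0.55, 0.02), (0.48, 0.03)]
for m, v in observations:
    if compatible(mu, var, m, v):
        mu, var = fuse(mu, var, m, v)
print(mu, var)  # uncertainty shrinks with each successful fusion
```

The key property is that the fused variance is always smaller than either input variance, so depth uncertainty decreases as more consistent viewpoints are accumulated.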

Video

New results on DSEC Dataset (27 May 2021)

Publications

Yi Zhou, Guillermo Gallego, and Shaojie Shen. "Event-based Stereo Visual Odometry." IEEE Transactions on Robotics, 37, no. 6 (2021).


@article{zhou2021event,
  title     = {Event-based stereo visual odometry},
  author    = {Zhou, Yi and Gallego, Guillermo and Shen, Shaojie},
  journal   = {IEEE Transactions on Robotics},
  year      = {2021},
  publisher = {IEEE}
}

Yi Zhou, Guillermo Gallego, Henri Rebecq, Laurent Kneip, Hongdong Li, and Davide Scaramuzza. "Semi-dense 3D reconstruction with a stereo event camera." In Proceedings of the European Conference on Computer Vision (ECCV), pp. 235-251. 2018.

@inproceedings{zhou2018semi,
  title     = {Semi-dense 3D reconstruction with a stereo event camera},
  author    = {Zhou, Yi and Gallego, Guillermo and Rebecq, Henri and Kneip, Laurent and Li, Hongdong and Scaramuzza, Davide},
  booktitle = {Proc. European Conference on Computer Vision (ECCV)},
  pages     = {235--251},
  year      = {2018}
}

Source Code

Together with the code, we have also released the CAD design of the stereo rig in the GitHub repository.

Dataset Download

The original rpg (University of Zurich) and upenn (University of Pennsylvania) datasets can be downloaded from

For convenience, we provide the edited rosbag files used in the paper. The editing procedure is described in the "readme" file under /events_repacking_helper.

edited rpg stereo dataset

edited upenn stereo dataset

hkust stereo dataset

License

The software is released under a GPLv3 license. For commercial use, please contact the authors.

Additional Resources

News

  • 2021/2/23 ESVO has been integrated into the modular iniVation DV software platform. Download here: https://lnkd.in/deuRKSK

  • 2021/2/16 ESVO was accepted by IEEE Transactions on Robotics (T-RO).

Contact

If you have any questions about this project, please contact

Yi Zhou (eeyzhou@ust.hk)

Guillermo Gallego

Shaojie Shen