Introduction to Event Detection Cameras
Sign up for the tutorial at the IEEE International Conference on Computer Vision 2021!
Keigo Hirakawa
Professor
University of Dayton
khirakawa1 {at} udayton {dot} edu
Davide Migliore
Technical Business Developer
Senior Computer Vision Engineer
Prophesee
dmigliore {at} prophesee {dot} ai
What is an event detection camera?
Event detection cameras, which emerged from biologically inspired visual perception, offer a rich area of engineering innovation, research, and applications. An event-based vision sensor (EVS) generates a sparse, asynchronous data stream reporting temporal log-intensity changes (or events) at its pixel-sized photodiodes. EVS offers a far wider dynamic range (>100 dB) and higher temporal resolution (>800 kHz) than the familiar active pixel sensor (APS), but lacks the notion of pixel intensity. Because conventional intensity-based image processing and computer vision techniques designed for APS fail on EVS data, a new set of EVS-specific tools needs to be developed.
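To make the event generation process concrete, here is a minimal sketch of an idealized EVS forward model: a pixel fires an event whenever its log intensity has changed by more than a contrast threshold since the pixel's last event. The function name, the threshold value, and the (t, x, y, polarity) tuple layout are illustrative assumptions, not a real sensor API.

```python
import numpy as np

def generate_events(log_frames, timestamps, theta=0.2):
    """Idealized EVS forward model (sketch, not a real sensor API).

    A pixel emits a +1 (-1) event each time its log intensity rises
    (falls) by the contrast threshold theta relative to the pixel's
    per-pixel reference level, which is updated after every event.

    log_frames: (T, H, W) array of log-intensity frames
    timestamps: (T,) array of frame times
    Returns a list of (t, x, y, polarity) tuples.
    """
    ref = log_frames[0].astype(float).copy()  # per-pixel reference level
    events = []
    for t, frame in zip(timestamps[1:], log_frames[1:]):
        diff = frame - ref
        # A large change can trigger several events at the same pixel.
        while True:
            pos = diff >= theta
            neg = diff <= -theta
            if not (pos.any() or neg.any()):
                break
            for y, x in zip(*np.nonzero(pos)):
                events.append((t, x, y, +1))
            for y, x in zip(*np.nonzero(neg)):
                events.append((t, x, y, -1))
            ref[pos] += theta
            ref[neg] -= theta
            diff = frame - ref
    return events
```

Note how a single 0.5 log-intensity step at one pixel yields multiple same-polarity events with theta = 0.2, which is why event rate scales with contrast rather than absolute brightness.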
About the tutorial
We will characterize the operating characteristics of EVS and establish the details researchers need to begin working with event data. The goal of this tutorial is to help practitioners understand the key benefits and characteristics of event detection cameras, establish a working knowledge of how event sensor data is processed, and develop a starting point for working with event data in their own environments. We cover common computational techniques for working with sparse event data, from foundations to examples of state-of-the-art applications of event cameras.
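One of the simplest such techniques is accumulating the sparse event stream into a dense per-pixel count image, a common first step before applying frame-based tools. This is a hedged sketch: the helper name and the (t, x, y, polarity) event layout are assumptions for illustration, not part of any particular SDK.

```python
import numpy as np

def events_to_count_image(events, height, width):
    """Accumulate a sparse event stream into a signed per-pixel
    count image, one of the simplest event representations.

    events: iterable of (t, x, y, polarity) with polarity in {+1, -1}
    Returns an (H, W) int32 array of net event counts.
    """
    img = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        img[y, x] += p  # positive and negative events cancel
    return img
```

In practice one would accumulate over a fixed time window or a fixed event count; richer representations (time surfaces, voxel grids) extend this same idea.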
Who is it for? What will I learn?
Event detection cameras are relatively new, and while the technology has gained wide attention, tutorial resources are still limited. The technology differs significantly from conventional cameras, and the hurdle to entry for working with event cameras is relatively high. This tutorial is therefore designed to bridge the knowledge gap for practitioners who already work in image processing and computer vision by providing a solid foundation for working with event data. We cover topics comprehensively (e.g., sensor architecture, data acquisition, signal processing, feature extraction, machine learning, applications), yet also include specific modules aimed at giving participants the tools to start working with event cameras.
Outline and list of topics to be covered (subject to change)
Introduction
Introduction to event-based imaging
Hardware: Event-Based Vision Sensor (EVS) and its variants
Sensor technology roadmap
Biologically inspired visual perception
Spiking neural networks and convolutional neural networks
Event Representation
EVS forward model and simulation methods
Event representation in machine learning
Event filtering
Computational Imaging
Frame reconstruction
Optical flow
Deblurring
Stereo matching and 3D reconstruction
Hybrid frame and event detection camera setup
Computer Vision
Event-based feature extraction
Tracking and SLAM
Object detection and classification
Human pose estimation
Applications and Case Studies
Navigation and automotive
Automation
Robotics
"Quick start guide"
Hardware demonstration
Demonstration of Prophesee Metavision platform
Event processing workflow
Conclusion
Discussions, research opportunities, technology growth
Q&A