ESP18: Fundamentals of Event Sensor Signal Processing

Mailing list (only for workshop participants): https://groups.google.com/d/forum/neuromorph-esp18

Goals

To develop fundamental theory and algorithms for event sensor signal processing that enable event sensor applications at the limits of low signal-to-noise ratio, e.g. in low-light conditions, bioimaging, and space applications.

  1. Can we lay a practical mathematical foundation that allows deriving efficient event-driven signal processing algorithms, analogous to the Z-transform of DSP?
  2. Can we find (and quantify) better noise reduction (NR) algorithms than existing ones?
  3. Can we find general methods for adaptively controlling sensor parameters like threshold, bandwidth, and refractory period?
  4. Can we find better input representations for event camera data for CNNs?
  5. What can we do to combine DVS events with color vision?
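As a concrete illustration of question 1, the sketch below (an assumed example, not taken from the workshop materials) shows an event-driven first-order low-pass filter: the IIR update is applied once per event with a decay factor computed from the irregular inter-event interval, instead of at a fixed sample rate as in conventional DSP.

```python
import math

def event_lowpass(events, tau=0.01):
    """Event-driven first-order low-pass filter.

    events: iterable of (timestamp_sec, value) pairs with increasing timestamps.
    tau: filter time constant in seconds.
    Returns one filtered value per input event.
    """
    out = []
    state = None
    t_prev = None
    for t, x in events:
        if state is None:
            state = x  # initialize the filter state on the first event
        else:
            # The decay factor depends on the (irregular) inter-event
            # interval, unlike a fixed-rate filter where it is constant.
            alpha = 1.0 - math.exp(-(t - t_prev) / tau)
            state += alpha * (x - state)
        t_prev = t
        out.append(state)
    return out
```

A continuous-time formulation like this keeps the filter consistent under arbitrary event timing, which is one candidate starting point for the Z-transform-like theory asked for above.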

Organizers

Tobi Delbruck (Univ. of Zurich and ETH Zurich) and Greg Cohen (Western Sydney Univ.)

Confirmed Invited Participants

  1. Prof. Yiannis Andreopoulos (University College London) - signal processing on frame-free representations with event cameras
  2. Dr. David Mascarenas (Los Alamos Natl Labs) — dynamic structural analysis with DVS
  3. Prof. Cornelia Fermuller, Univ. Maryland
  4. Dr. Francisco Barranco, Univ. of Granada
  5. Prof. Ryad Benosman, UPMC Paris
  6. Dr. Garrick Orchard, Natl Univ. of Singapore
  7. Greg Burman, inivation.com
  8. Alex Zhu, U Penn

Potential Projects

  1. Formulate theory of linear signal processing in event-driven form.
  2. Record moving objects (e.g. a pendulum, people) under a range of lighting conditions from bright light to nearly complete darkness, and determine how to dynamically control event filtering and bias settings to maximize tracking accuracy.
  3. Apply results to space satellite tracking and analysis of Ca++ fluorescence in vitro recordings.
  4. Apply adaptive filtering to real-world driving recordings to understand impact of filtering on steering wheel prediction accuracy.

Tutorials

  1. Using DVS, DAVIS and ATIS event cameras in jAER, cAER, python, and ROS (see below)
  2. Developing event-driven algorithms (noise filter and median tracking examples)
  3. Generating and using DNN training data from DAVIS
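To give a flavor of the noise-filter tutorial example, here is a minimal sketch of a spatiotemporal correlation ("background activity") filter of the kind commonly used for DVS denoising. The function name, event tuple layout, and parameters are assumptions for illustration, not the tutorial's actual code: an event is kept only if some pixel in its 3x3 neighborhood (including itself) fired within the last dt_us microseconds.

```python
import numpy as np

def background_activity_filter(events, width, height, dt_us=10000):
    """Spatiotemporal correlation noise filter for DVS events.

    Keeps an event only if a pixel in its 3x3 neighborhood produced an
    event within the last dt_us microseconds; otherwise it is treated
    as uncorrelated background noise.
    events: iterable of (t_us, x, y, polarity) with increasing t_us.
    """
    # Padded per-pixel timestamp map so border pixels need no special case.
    last_ts = np.full((height + 2, width + 2), -np.inf)
    kept = []
    for t, x, y, p in events:
        # 3x3 neighborhood of (x, y) in padded coordinates.
        window = last_ts[y:y + 3, x:x + 3]
        if t - window.max() <= dt_us:
            kept.append((t, x, y, p))
        last_ts[y + 1, x + 1] = t  # record this event's timestamp
    return kept
```

Isolated noise events have no recent neighbors and are discarded, while events generated by a moving edge correlate in space and time and pass through; the same structure extends naturally to the adaptive parameter control discussed in the goals above (e.g. tying dt_us to scene illumination).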

Equipment

  1. A variety of prototype event cameras from the sensors group at INI and inivation.
  2. Software toolboxes, including jAER, cAER, ROS, and a preview of a new real-time Python sensor interface
  3. The new SLASHER Traxxas fast stadium racer RC car platform
  4. A new stereo pan tilt DVS rig to explore task-based stereo tracking accuracy