To develop fundamental theory and algorithms for event sensor signal processing that enable event sensor applications at the limits of low signal-to-noise ratio, e.g. in low-light conditions, bioimaging, and space applications.
- Can we lay a practical mathematical foundation for deriving efficient event-driven signal processing algorithms, analogous to the role of the Z-transform in conventional DSP?
- Can we find, and quantitatively benchmark, noise reduction (NR) algorithms that outperform existing ones?
- Can we find general methods for adaptively controlling sensor parameters such as contrast threshold, bandwidth, and refractory period?
- Can we find better input representations for event camera data for CNNs?
- How can we best combine DVS events with conventional color vision?
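One baseline the NR question could be benchmarked against is the classic background-activity filter: an event is kept only if some pixel in its spatial neighborhood fired recently, so isolated events are discarded as noise. A minimal sketch, with illustrative parameter values (variants differ, e.g. in whether the event's own pixel counts as support):

```python
import numpy as np

def background_activity_filter(events, width, height, dt_max=10_000):
    """Keep an event only if a pixel in its 3x3 neighborhood fired
    within dt_max microseconds; isolated events are treated as noise.

    events: iterable of (t, x, y, polarity), t in microseconds,
            sorted by t. dt_max = 10 ms is an illustrative value.
    """
    # Last timestamp seen at each pixel; -inf means "never fired".
    last_ts = np.full((height, width), -np.inf)
    kept = []
    for t, x, y, p in events:
        y0, y1 = max(0, y - 1), min(height, y + 2)
        x0, x1 = max(0, x - 1), min(width, x + 2)
        # Support = most recent event in the 3x3 neighborhood
        # (here including this pixel's own past events).
        if t - last_ts[y0:y1, x0:x1].max() <= dt_max:
            kept.append((t, x, y, p))
        last_ts[y, x] = t
    return kept
```

Per event this costs only a constant-size neighborhood lookup, which is why hardware-friendly variants of this filter are widely used as a reference point.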
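For the adaptive-control question, one hypothetical starting point is a proportional rule that steers the measured event rate toward a target by adjusting the contrast threshold. The function name, gain, and all constants below are illustrative assumptions, not a published method:

```python
def adapt_threshold(theta, event_rate, target_rate, gain=0.05,
                    theta_min=0.05, theta_max=1.0):
    """One illustrative control step (hypothetical proportional rule):
    raise the contrast threshold theta when the event rate exceeds the
    target (fewer, larger brightness changes needed per event), lower
    it otherwise, clamped to a plausible hardware range."""
    error = (event_rate - target_rate) / target_rate
    theta_new = theta * (1.0 + gain * error)
    return min(max(theta_new, theta_min), theta_max)
```

A real controller would also have to respect sensor-specific bias limits and the coupling between threshold, bandwidth, and refractory period, which is precisely what the question above asks to formalize.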
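For the representation question, a common family of CNN inputs discretizes the event stream into a fixed number of temporal bins ("voxel grid"). The sketch below assumes events arrive as an (N, 4) array of (t, x, y, polarity) and uses bilinear weighting in time; details (bin count, polarity handling, normalization) vary across the literature:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, width, height):
    """Accumulate events into a (num_bins, H, W) grid, splitting each
    event's polarity between its two nearest time bins (bilinear
    weighting in time). Illustrative sketch, not a fixed standard.

    events: (N, 4) array with columns (t, x, y, polarity),
            polarity in {-1, +1}.
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0].astype(np.float64)
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3].astype(np.float32)
    # Normalize timestamps to [0, num_bins - 1].
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1)
    left = np.floor(t_norm).astype(int)
    right = np.minimum(left + 1, num_bins - 1)
    w_right = (t_norm - left).astype(np.float32)
    # Unbuffered accumulation so repeated (bin, y, x) indices add up.
    np.add.at(grid, (left, y, x), p * (1.0 - w_right))
    np.add.at(grid, (right, y, x), p * w_right)
    return grid
```

The open question is whether such hand-designed binnings discard timing information that a better (possibly learned) representation could preserve.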