Unconventional Sensors in Robotics:

Perception for Online Learning, Adaptive Behavior and Cognition

(Virtual) 4th June 2020

Thanks to all speakers and attendees for their contributions to a very fruitful workshop!

Videos of the talks will be available soon.

Motivation:

Computer vision has greatly advanced robotic autonomy and underpins the recent successes in navigation, cognitive robotics, and robot learning. For many applications, however, conventional vision incurs a large computational overhead with high power consumption and latency. Moreover, some environments, e.g. underwater or inside medical tissue, pose considerable challenges to conventional vision. Tasks such as object manipulation require sensory feedback “from the fingertips”, and non-visual cues are known to play an important role in animal navigation. Unconventional vision in the form of Dynamic Vision Sensors has recently become almost mainstream and has been picked up by companies, enabling new prosthetic devices, agile drone control, and low-power surveillance systems. But what about all the other sensing modalities? Why are they still “unconventional”? Sometimes we don’t know how to process their output efficiently. Sometimes their price and availability limit our access. But most of the time, we simply don’t know that they exist.

This workshop aims to provide a forum for research beyond conventional robotic sensing: to share experience, give an overview of available and emerging technologies, and explore their potential and limitations, as well as desiderata for sensors that are still lacking. Modalities such as olfaction, chemical sensing, electrosense, magnetic sensing, wind, tactile, contact, pressure, and force sensing will be covered in a number of invited presentations from research labs and companies. An overview of biological sensing will provide perspectives on possible future developments in sensing technology. Live demonstrations of existing working sensors will round out the program.
