Novel Sensors for Autonomous Vehicle Perception
Room 140C
Overview
Current systems for autonomous vehicle (AV) perception rely on widely used sensors such as cameras and LiDAR. However, AV perception systems built on these conventional sensors degrade under difficult lighting and adverse weather conditions. For example, conventional frame-based cameras are widely used due to their low cost and high spatial resolution, yet they are challenged by low-light conditions, fog, rain, and snow, and suffer from motion blur, limited frame rates, and low dynamic range. Novel sensors, such as event and thermal cameras, offer alternatives that address these limitations. Event cameras feature high temporal resolution, effectively no motion blur, high dynamic range, low power requirements, and low latency, making them promising under adverse lighting, fast motion, and resource constraints. Thermal cameras sensitive to the long-wave infrared (LWIR) spectrum can operate in the absence of visible light and are robust to changes in illumination and to visual obscurants such as fog, dust, and smoke. These advantages come with challenges: thermal images have lower contrast and greater motion blur, while event cameras represent a new sensing paradigm that is not directly compatible with existing frame-based algorithms. Advancing AV capabilities with novel sensors will require further research and development across academia and industry. This workshop seeks to engage the research community in discussion of exciting new research topics in novel sensors for AV perception, and will feature the release of a unique dataset for AV perception with novel sensors as well as a student poster session.
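As a minimal illustration of why event streams are not directly usable by frame-based algorithms, the asynchronous events must first be converted into a frame-like representation, for example by accumulating event polarities over a time window. The sketch below is hypothetical: the (t, x, y, p) event layout is an assumption for illustration, not the output format of any particular camera or driver.

```python
import numpy as np

def events_to_frame(events, height, width, t_start, t_end):
    """Accumulate event polarities in [t_start, t_end) into a 2D frame.

    `events` is assumed to be an array of (t, x, y, p) rows, with
    timestamp t in seconds, pixel coordinates (x, y), and polarity
    p in {-1, +1}. This layout is illustrative only.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            frame[int(y), int(x)] += int(p)  # signed accumulation per pixel
    return frame

# Toy example: three events on a 4x4 sensor.
events = np.array([
    [0.001, 1, 2, +1],   # brightness increase at (x=1, y=2)
    [0.002, 1, 2, +1],
    [0.003, 3, 0, -1],   # brightness decrease at (x=3, y=0)
])
frame = events_to_frame(events, height=4, width=4, t_start=0.0, t_end=0.01)
print(frame[2, 1])  # 2: two positive events accumulated at that pixel
print(frame[0, 3])  # -1
```

Practical pipelines use richer representations (time surfaces, voxel grids, or learned encodings), but all face the same design choice sketched here: collapsing an asynchronous stream into a form that downstream frame-based algorithms can consume.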
Schedule
Poster Presentations
CycleGAN-based Imaging Radar to LiDAR Image Translation for 2D Extrinsic Calibration, Sangwoo Jung, Hyesu Jang, Minwoo Jung, Myung-Hwan Jeon and Ayoung Kim (Seoul National University)
Infrared Random Dot Markers for Autonomous Landing in Extreme Condition, Kalliyan Velasco, Joshua G Mangelson, David Akagi and Tim McLain (Brigham Young University)
CSTR: a Compact Spatio-Temporal Representation for Event-Based Vision, Zaid A El Shair, Ali Hassanim and Samir Rawashdeh (University of Michigan - Dearborn)
Multispectral Deep Neural Network Fusion for Low-Light Object Detection, Keval Thaker, Sumanth Chennupati, Nathir Rawashdeh and Samir Rawashdeh (University of Michigan - Dearborn)
Soft Wheel with Embedded Acoustic Sensing for Terrain Mapping and Obstacle Detection, Wilfred A. Mason, Olivier St-Martin Cormier, David Brenken and Audrey Sedal (McGill University)
HeLiPR: Heterogeneous LiDAR Dataset for inter-LiDAR Place Recognition in Spatial and Temporal Variation Environments, Minwoo Jung, Dongjae Lee, Wooseong Yang and Ayoung Kim (Seoul National University)
LoRa Signal Strength Based Estimation using Non-Parametric Mapping and Localization, Derek R Benham, Ashton Palacios, Ethan Smith, Philip Lundrigan and Joshua G Mangelson (Brigham Young University)
Organizers
• Katherine (Katie) Skinner, Assistant Professor, Department of Robotics, University of Michigan
• Ram Vasudevan, Associate Professor, Department of Robotics, University of Michigan
• Manikandasriram (Mani) Srinivasan Ramanagopal, Project Scientist, Robotics Institute, Carnegie Mellon University
• Spencer Carmichael, PhD Candidate, Department of Robotics, University of Michigan
• Austin Buchan, Research Engineer, Department of Robotics, University of Michigan
• Radhika Ravi, Senior Research Engineer, Samsung
• Gaurav Pandey, Technical Leader, Ford Motor Company
• Alexandra (Alexa) Carlson, Research Scientist, Ford Motor Company
Datasets and Resources
The Novel Sensors for Autonomous Vehicle Perception dataset collected for this workshop includes forward-facing stereo uncooled thermal cameras (FLIR Boson 640+ ADK), event cameras (iniVation DVXplorer), monochrome cameras (FLIR BFS-PGE-16S2M), and RGB cameras (FLIR BFS-PGE-50S5C), aligned with ground-truth pose from a high-precision navigation system (Applanix POS-LV 420). Sequences cover ~10 km routes driven repeatedly under varying lighting conditions and feature instances of direct sunlight and low light that challenge conventional cameras. To our knowledge, this new dataset is the first to include stereo thermal cameras together with stereo event cameras and stereo monochrome cameras, which perform better in low light than RGB cameras. The dataset is published in the University of Michigan Library's Deep Blue Data repository; downloading large (>3 GB) dataset files may require Globus, a data-transfer tool.
A GitHub repository is available with supporting documentation and software tools for converting, visualizing, and processing the above datasets.
Call for Participation
Option 1: Novel Sensors Dataset Challenge
To enable new research in the area of novel sensors for autonomous vehicles, this workshop features the release of a new dataset for research on novel sensors for autonomous vehicle perception. We invite extended abstract and full paper submissions that apply prior research to this new dataset or that leverage it for novel research. Extended abstracts should be 2-4 pages including references and are encouraged for preliminary or ongoing work. Full papers should be 6-8 pages including references. Authors of accepted submissions will have the opportunity to present their work at a poster session.
Option 2: Student Poster Session
We invite students engaged in university research to participate in a student poster session. The session will focus on the broad topic of "Novel Sensors for Autonomous Vehicle Perception," and we welcome submissions across the full range of topics of interest. Submissions may be extended abstracts (2-4 pages including references, encouraged for preliminary or ongoing work) or full papers (6-8 pages including references). Accepted papers will not be published. The student poster session will be hosted by industry organizers from Ford Motor Company and offers a great networking opportunity for students to discuss their work with researchers from industry.
The workshop will feature invited short talks, selected from both the dataset challenge submissions and the student poster session submissions, to highlight outstanding research across the workshop. Selected presenters will receive free workshop/tutorial registration!
Deadlines
Extended to September 1st at 11:59 PM Pacific Time - Submissions Due
September 11th at 11:59 PM Pacific Time - Decisions Released
September 22nd at 11:59 PM Pacific Time - Final Presentations/Posters Due
October 1st from 1:30-5:30 PM Eastern Time - Workshop at IROS 2023
Submission Link: https://cmt3.research.microsoft.com/NOVELSENSORS2023