The Mapping and Motion (M&M) lab works to create autonomous robots that can intelligently decide what to do. To address this challenge, the lab focuses on perceptual reasoning in robotic systems.
The M&M lab is also interested in collaborative projects with researchers in other fields whose problems require robust perception.
Aditya Pratap Singh, et al. Structure-Preserving Unpaired Image Translation to Photometrically Calibrate JunoCam with Hubble Data. https://m-and-m-lab.github.io/junocam-calibration/
Basavasagar Patil, et al. Using Temperature Sampling to Effectively Train Robot Learning Policies on Imbalanced Datasets. https://basavasagarkp.github.io/temp_samp/
Katrina Ashton, et al. HELIOS: Hierarchical Exploration for Language-grounded Interaction in Open Scenes. https://helios-robot-perception.github.io/