The Mapping and Motion (M&M) Lab's primary goal is to create autonomous robots that can perform embodied search.
To address this challenge, the M&M Lab advances fundamental techniques in mobile manipulation, robust perception, exploration, and scene understanding. The lab also pursues collaborative projects with researchers in other fields where these algorithmic advances are relevant, primarily space applications and clinical medicine.
Aditya Pratap Singh, et al. Structure-Preserving Unpaired Image Translation to Photometrically Calibrate JunoCam with Hubble Data. https://m-and-m-lab.github.io/junocam-calibration/
Basavasagar Patil, et al. Using Temperature Sampling to Effectively Train Robot Learning Policies on Imbalanced Datasets. https://basavasagarkp.github.io/temp_samp/
Katrina Ashton, et al. HELIOS: Hierarchical Exploration for Language-grounded Interaction in Open Scenes. https://helios-robot-perception.github.io/