MetaRoom Dataset: Robustness Certification and Verification against Camera Motion Perturbation
News and updates
Jan. 2024: Our new work on pixel-wise smoothing against camera motion perturbation, empowered by the MetaRoom dataset, is accepted to AISTATS 2024
June 2023: Our work on sound formal robustness verification is accepted to the 2023 ICML 2nd Workshop on Formal Verification of Machine Learning
May 2023: The MetaRoom benchmark is featured in VNN-COMP 2023! See VNN-COMP 2023 for details.
Sept. 2022: Our work on robustness certification with camera motion smoothing is accepted to CoRL 2022
Introduction
MetaRoom is a realistic indoor object recognition dataset with dense point cloud maps, which enable image projection from arbitrary camera poses. It supports the challenging task of robustness certification and verification against camera motion perturbations (translation and rotation along the x-, y-, or z-axis) in real-world robotic applications.
Specifically, the MetaRoom dataset contains 20 commonly seen indoor objects, each placed on a table in the center of an empty room built in Webots. For each object, an RGBD camera moves around the object randomly and captures images for the training set; point cloud maps with 0.0025 m density are reconstructed from the collected camera poses and intrinsic parameters, and images projected from these maps at 6 fixed positions and orientations form the test set. Note that the camera poses in the training set are at least 15 degrees away in yaw, pitch, and roll from every pose in the test set. More details can be found here.
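The projection step described above can be sketched with a standard pinhole camera model. This is a minimal illustration only, not the dataset's actual projection code; the function name, intrinsics, and pose values below are all assumptions.

```python
# Minimal sketch (not the official MetaRoom pipeline): project points of a
# point cloud onto the image plane with a pinhole camera model.
import numpy as np

def project_points(points_world, K, R, t):
    """Project Nx3 world points to pixel coordinates.

    K: 3x3 camera intrinsics; R, t: world-to-camera rotation and translation.
    Returns Nx2 pixel coordinates and a mask of points in front of the camera.
    """
    pts_cam = points_world @ R.T + t   # world frame -> camera frame
    in_front = pts_cam[:, 2] > 0       # keep points with positive depth
    uvw = pts_cam @ K.T                # apply intrinsics
    uv = uvw[:, :2] / uvw[:, 2:3]      # perspective divide
    return uv, in_front

# Illustrative values: identity pose, simple intrinsics.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0]])      # a point 2 m in front of the camera
uv, mask = project_points(pts, K, R, t)
print(uv[0])                           # projects to the principal point (320, 240)
```

Perturbing the camera pose (R, t) along one axis and re-projecting is the basic operation behind the dataset's camera motion perturbations.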
Visualization of Dense Point Cloud Map
Two demos of the dense point cloud for the object Telephone are shown below. Note that these demos only illustrate the density of the point cloud; in the point clouds collected in the dataset, the camera is much farther away from the object.
Visualization of Projected Images
The projected images under the 6-axis perturbations are shown below. From left to right: the panoptic image, followed by perturbations along z-axis translation, x-axis translation, y-axis translation, z-axis rotation, x-axis rotation, and y-axis rotation.
Download
Download the dataset from here; it includes both the camera-motion-augmented and vanilla datasets, each covering the 6 camera motions. Each dataset contains training and validation sets of images, plus a certification (test) set of point clouds, camera poses, and intrinsics stored as pickle files. Note that the training and validation sets are captured directly from Webots, while the certification set is projected from the point clouds for robustness certification and verification. Detailed usage instructions can be found on GitHub.
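As a rough sketch of working with the certification set, the snippet below round-trips a pickle file holding the kinds of fields described above (point cloud, camera pose, intrinsics). The file name and dictionary keys are hypothetical; consult the GitHub repository for the actual file layout and keys.

```python
# Hypothetical loading sketch: the file name and dictionary keys below are
# assumptions for illustration, not the dataset's real format.
import pickle

def load_certification_set(path):
    """Load a certification pickle (point cloud, camera pose, intrinsics)."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Round-trip demo with a stand-in record (structure is assumed):
record = {"point_cloud": [[0.0, 0.0, 2.0]],
          "camera_pose": [0.0] * 6,
          "intrinsics": [[500.0, 0.0, 320.0],
                         [0.0, 500.0, 240.0],
                         [0.0, 0.0, 1.0]]}
with open("demo_cert.pkl", "wb") as f:
    pickle.dump(record, f)

loaded = load_certification_set("demo_cert.pkl")
print(sorted(loaded.keys()))  # ['camera_pose', 'intrinsics', 'point_cloud']
```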
Citation
If you find MetaRoom dataset useful, please cite:
H. Hu, C. Liu, and D. Zhao, "Robustness Verification for Perception Models against Camera Motion Perturbations", ICML WFVML 2023
@inproceedings{hu2023robustness,
  title     = {Robustness Verification for Perception Models against Camera Motion Perturbations},
  author    = {Hu, Hanjiang and Liu, Changliu and Zhao, Ding},
  booktitle = {ICML Workshop on Formal Verification of Machine Learning (WFVML)},
  year      = {2023}
}
H. Hu, Z. Liu, L. Li, J. Zhu, and D. Zhao, "Robustness Certification of Visual Perception Models via Camera Motion Smoothing", CoRL 2022
@InProceedings{pmlr-v205-hu23b,
  title     = {Robustness Certification of Visual Perception Models via Camera Motion Smoothing},
  author    = {Hu, Hanjiang and Liu, Zuxin and Li, Linyi and Zhu, Jiacheng and Zhao, Ding},
  booktitle = {Proceedings of The 6th Conference on Robot Learning},
  pages     = {1309--1320},
  year      = {2023},
  volume    = {205},
  series    = {Proceedings of Machine Learning Research},
  month     = {14--18 Dec},
  publisher = {PMLR}
}