We assembled a team of researchers from UC Berkeley, Google, Amazon, and MIT to ask the question: can we drastically reduce the cost of data annotation? In response, we developed SimNet, a low-fidelity procedural simulator for tackling 3D perception in home robotics.
To enable this, we first built a procedural home simulator that can generate a wide range of home layouts, furniture, and clutter.
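As a rough illustration of what such a generator might look like, here is a minimal Python sketch of a sample-and-place scheme. The room types, furniture lists, and the `sample_room` function are all hypothetical assumptions for illustration, not the actual SimNet API.

```python
# A minimal sketch of procedural home generation, assuming a simple
# sample-and-place scheme; room types, furniture lists, and function
# names here are hypothetical, not the actual SimNet API.
import random

ROOM_TYPES = ["kitchen", "living_room", "bedroom"]
FURNITURE = {
    "kitchen": ["table", "chair", "counter"],
    "living_room": ["sofa", "coffee_table", "tv_stand"],
    "bedroom": ["bed", "dresser", "nightstand"],
}
CLUTTER = ["mug", "book", "bowl", "remote"]

def sample_room(rng: random.Random) -> dict:
    """Sample one room: a rectangular footprint, furniture, and surface clutter."""
    room_type = rng.choice(ROOM_TYPES)
    width, depth = rng.uniform(3.0, 6.0), rng.uniform(3.0, 6.0)
    furniture = [
        {"name": f, "xy": (rng.uniform(0, width), rng.uniform(0, depth))}
        for f in rng.sample(FURNITURE[room_type], k=rng.randint(1, 3))
    ]
    clutter = [
        {"name": rng.choice(CLUTTER), "on": rng.choice(furniture)["name"]}
        for _ in range(rng.randint(0, 5))
    ]
    return {"type": room_type, "size": (width, depth),
            "furniture": furniture, "clutter": clutter}

if __name__ == "__main__":
    rng = random.Random(0)
    print(sample_room(rng))
```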
Given synthetic 3D data, we can obtain annotations for room-level segmentation, oriented bounding boxes (OBBs) of objects, and the state of articulated objects. To enable transfer to the real world, we use our novel stereo-based sim2real technique. Below are predictions from our panoptic detector trained only on synthetic data.
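For concreteness, the per-frame synthetic labels might be represented along these lines. This is a hypothetical structure under our own naming assumptions, not SimNet's actual data format; field shapes are illustrative.

```python
# A hypothetical per-frame label structure for the synthetic data described
# above; field names and shapes are illustrative assumptions, not SimNet's
# actual format.
from dataclasses import dataclass

import numpy as np

@dataclass
class OrientedBBox:
    center: np.ndarray    # (3,) object center in the camera frame, meters
    extents: np.ndarray   # (3,) full box dimensions, meters
    rotation: np.ndarray  # (3, 3) box orientation as a rotation matrix
    category: str         # semantic class, e.g. "mug"

@dataclass
class SyntheticFrame:
    left_image: np.ndarray          # (H, W, 3) left image of the stereo pair
    right_image: np.ndarray         # (H, W, 3) right image of the stereo pair
    segmentation: np.ndarray        # (H, W) room-level panoptic label map
    boxes: list[OrientedBBox]       # per-object OBB annotations
    joint_states: dict[str, float]  # articulated object name -> opening (rad)
```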
We can then have our home robot use this dense 3D scene understanding to perform complex tasks in new homes, such as clearing a table.
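A hedged sketch of how a table-clearing loop might consume the detector's 3D output, reusing the hypothetical `OrientedBBox` fields above. The `robot`, `detector`, and `camera` objects are assumed interfaces, not part of SimNet.

```python
# A sketch of a table-clearing loop driven by the detector's 3D output;
# `robot`, `detector`, and `camera` are assumed interfaces, and the boxes
# reuse the hypothetical OrientedBBox fields from the sketch above.
import numpy as np

def clear_table(robot, detector, camera) -> None:
    """Detect objects on the table and move them to a bin until none remain."""
    while True:
        left, right = camera.capture_stereo()
        scene = detector.predict(left, right)  # assumed inference call
        graspable = [b for b in scene.boxes if b.category != "table"]
        if not graspable:
            break  # table is clear
        # Pick the closest object first, by distance from the camera origin.
        target = min(graspable, key=lambda b: float(np.linalg.norm(b.center)))
        robot.pick(target.center, target.rotation)
        robot.place_in_bin()
```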
Home Robot Blog Series Part 1
Home Robot Blog Series Part 2
Robot Block Party in Oakland
Michael Laskey
Kevin Stone
Thomas Kollar
Mark Tjersland
Interested in using SimNet for indoor scene understanding?
Please reach out to us at: simnet@tri.global