Robot Active Neural Sensing and Planning in Unknown Cluttered Environments


Real-world experiment demos

real_cabinet_experiment_1.mp4

Real-world Experiment 1


Our active neural sensing platform consists of a UR5e manipulator arm with an in-hand Intel RealSense D435i depth camera. In total, 7 unknown objects are randomly placed in the cabinet, and the robot is placed at (-0.4, 0.3, 0.95) m. The robot viewpoint configuration, the in-hand camera observation, and the reconstructed scene are shown from left to right; the step counter in the top-left corner indicates the current viewpoint ID.
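For readers who want a concrete starting point for the in-hand sensing side of this platform, the minimal sketch below grabs a single metric depth frame from a D435i using Intel's pyrealsense2 Python bindings. It is illustrative only, not our exact capture pipeline; the 640x480 @ 30 fps stream settings are placeholder defaults rather than the configuration used in these experiments.

```python
import numpy as np
import pyrealsense2 as rs

# Open the RealSense pipeline and request a depth stream
# (640x480 @ 30 fps is an illustrative default, not our setting).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
profile = pipeline.start(config)

# Scale factor converting raw z16 depth units to meters.
depth_scale = profile.get_device().first_depth_sensor().get_depth_scale()

try:
    frames = pipeline.wait_for_frames()      # block until a frameset arrives
    depth_frame = frames.get_depth_frame()
    # Zero-valued pixels mark invalid (no-return) depth readings.
    depth_m = np.asanyarray(depth_frame.get_data()).astype(np.float32) * depth_scale
    print(f"depth image {depth_m.shape}, median value {np.median(depth_m):.3f} m")
finally:
    pipeline.stop()
```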


real_cabinet_experiment_2.mp4

Real-world Experiment 2


Another experiment, with 5 unknown objects randomly placed in the cabinet and the robot placed at (-0.4, -0.3, 0.95) m. The robot viewpoint configuration, the in-hand camera observation, and the reconstructed scene are shown from left to right; the step counter in the top-left corner indicates the current viewpoint ID.


Simulation experiment demo

A viewpoint sequence generated by our VPFormer-based active sensing framework in a narrow, cluttered environment. Rows 1-4 show the robot viewpoint configurations, the in-hand camera observations, the scene representations, and the scene after object shape completion; each column's results come from the same viewpoint at a specific time step. The final reconstructed scene is shown in the bottom-right figure, as sketched in the loop below.
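To make the per-column structure of the figure concrete, the skeleton below sketches one plausible reading of the sense-plan-reconstruct cycle: plan a viewpoint, move the arm, observe, fuse the observation into the scene volume, and finally shape-complete the objects. Every name here (Robot, Camera, Planner, Reconstructor, the 64^3 occupancy grid) is a hypothetical placeholder rather than our released interface; only the loop order reflects the description above.

```python
from dataclasses import dataclass
from typing import Optional, Protocol
import numpy as np

@dataclass
class Viewpoint:
    joint_config: np.ndarray  # arm joint angles realizing this view
    cam_pose: np.ndarray      # 4x4 camera-to-world transform

class Robot(Protocol):
    def move_to(self, joint_config: np.ndarray) -> None: ...

class Camera(Protocol):
    def capture_depth(self) -> np.ndarray: ...

class Planner(Protocol):
    def next_viewpoint(self, scene: np.ndarray) -> Optional[Viewpoint]: ...

class Reconstructor(Protocol):
    def integrate(self, scene: np.ndarray, depth: np.ndarray,
                  cam_pose: np.ndarray) -> np.ndarray: ...
    def complete(self, scene: np.ndarray) -> np.ndarray: ...

def active_sensing(robot: Robot, camera: Camera, planner: Planner,
                   recon: Reconstructor, max_steps: int = 10) -> np.ndarray:
    """Sense-plan-reconstruct loop; returns the completed scene volume."""
    scene = np.zeros((64, 64, 64), dtype=np.float32)  # placeholder scene grid
    for _ in range(max_steps):
        view = planner.next_viewpoint(scene)   # e.g. a VPFormer-style scorer
        if view is None:                       # no informative view remains
            break
        robot.move_to(view.joint_config)       # drive the arm to the view
        depth = camera.capture_depth()         # in-hand depth observation
        scene = recon.integrate(scene, depth, view.cam_pose)  # fuse into volume
    return recon.complete(scene)               # fill occluded object regions
```

Each loop iteration corresponds to one column of the figure: the viewpoint configuration (row 1), the observation captured there (row 2), and the updated scene representation (row 3); the final shape-completed scene (row 4, bottom right) is produced after the loop terminates.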