HumanTHOR:
A Simulation Platform and Benchmark for
Human-Robot Collaboration in a Shared Workspace
Chenxu Wang∗, Boyuan Du∗, Jiaxin Xu, Peiyan Li, Huaping Liu
Human-robot collaboration (HRC) in a shared workspace has become a common pattern in real-world robot applications and has garnered significant research interest. However, most existing studies of human-in-the-loop (HITL) collaboration with robots in a shared workspace are evaluated in either simplified game environments or physical platforms, which suffer from limited realism or limited scalability, respectively. To support future studies, we build an embodied framework named HumanTHOR, which enables humans to act in the simulation environment through VR devices, supporting HITL collaboration in a shared workspace. To validate our system, we build a benchmark of everyday tasks and conduct a preliminary user study with two baseline algorithms. The results show that the robot can successfully assist humans in collaboration, which substantiates the significance of HRC. The comparison between different levels of baselines confirms that our system can adequately evaluate the capability of robots and serve as a benchmark for calibrating robot algorithms, while also revealing the remaining room for future study. In summary, our system provides a preliminary foundation for future research on HRC in the shared workspace.
Overview
Our video showcases the following:
The Overall System Architecture
Tour in Various Scenes
Operating Instructions
A Representative Case Demonstrating HRC in Our System
Human only
Human-robot collaboration
We present an example of loosely coupled HRC in a shared workspace, where the human wants to pick up an apple and put it into the fridge. Compared to the human acting alone, collaborating with a robot can significantly improve efficiency. Motivated by the ubiquitous need to study HRC, we developed the HumanTHOR system to support future HRC research.
Scene
This video demonstrates participants using VR to explore a single scene. For other scenes, please refer to the videos below.
This video demonstrates a multi-robot, multi-target mobile manipulation task, where the human and robots are asked to collect three apples and place them on the side table.
Since the green robot cannot interact with the fridge, it reports to the human and asks the human to check the fridge. In contrast, the red robot, which has manipulation capability, can retrieve the apple directly.
This video demonstrates a room rearrangement task, where participants first tour the room to memorize the object arrangement, and then restore that arrangement in a shuffled environment.
This video demonstrates a room tidying-up task, where the human and robots rearrange objects according to commonsense.