Articulated objects are prevalent in human life, and their 3D representations play crucial roles across various applications. However, achieving both high-fidelity textured surface reconstruction and dynamic generation for articulated objects remains challenging for existing methods. In this paper, we present REArtGS, a novel framework that introduces additional geometric and motion constraints to 3D Gaussian primitives, enabling high-quality textured surface reconstruction and generation for articulated objects. Specifically, given multi-view RGB images of two arbitrary states of an articulated object, we first introduce an unbiased Signed Distance Field (SDF) guidance to regularize Gaussian opacity fields, strengthening geometric constraints and improving surface reconstruction quality. We then establish deformable fields for the 3D Gaussians, constrained by the kinematic structure of the articulated object, achieving unsupervised generation of surface meshes in unseen states. Extensive experiments on both synthetic and real datasets demonstrate that our approach achieves high-quality textured surface reconstruction for given states and enables high-fidelity surface generation for unseen states. Code will be released within the next four months.
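To give a flavor of the SDF-guided opacity regularization described above, the sketch below shows one common way such a constraint can be formulated. This is an illustrative assumption, not the paper's exact formulation: it maps each Gaussian's signed distance to a target opacity that peaks at the zero level set (the surface) via a Laplace-style kernel, and penalizes deviation of the learned opacities from those targets. The function names and the `beta` sharpness parameter are hypothetical.

```python
import math

def sdf_opacity_target(sdf, beta=0.05):
    """Illustrative mapping from a signed distance to a target opacity.
    The target is 1 exactly on the surface (sdf == 0) and decays
    exponentially away from it; `beta` controls surface sharpness."""
    return math.exp(-abs(sdf) / beta)

def opacity_sdf_loss(opacities, sdfs, beta=0.05):
    """Mean squared error between Gaussian opacities and their
    SDF-derived targets -- a simple geometric regularizer in the
    spirit of the SDF guidance described in the abstract."""
    targets = [sdf_opacity_target(d, beta) for d in sdfs]
    return sum((o - t) ** 2 for o, t in zip(opacities, targets)) / len(opacities)

# Gaussians far from the surface are pushed toward low opacity,
# while Gaussians on the surface are pushed toward full opacity.
print(sdf_opacity_target(0.0))   # on-surface target
print(opacity_sdf_loss([1.0, 0.9], [0.0, 0.2]))
```

In practice such a term would be added to the photometric rendering loss, so that opacity optimization is steered by geometry rather than by appearance alone.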
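The kinematically constrained deformable fields can likewise be illustrated with a minimal sketch. Articulated parts typically move under one of two joint types, and constraining Gaussian centers to these motions is the core idea; the code below shows the two rigid transforms such a constraint would use (a revolute rotation via Rodrigues' formula and a prismatic translation). The function names are hypothetical, and the paper's actual deformable fields involve learned parameters beyond this.

```python
import math

def revolute_transform(p, origin, axis, angle):
    """Rotate point p about a revolute joint axis passing through
    `origin` with unit direction `axis`, by `angle` radians
    (Rodrigues' rotation formula)."""
    v = [p[i] - origin[i] for i in range(3)]
    k = axis
    kxv = [k[1] * v[2] - k[2] * v[1],
           k[2] * v[0] - k[0] * v[2],
           k[0] * v[1] - k[1] * v[0]]
    kdv = sum(k[i] * v[i] for i in range(3))
    c, s = math.cos(angle), math.sin(angle)
    return [origin[i] + v[i] * c + kxv[i] * s + k[i] * kdv * (1 - c)
            for i in range(3)]

def prismatic_transform(p, axis, dist):
    """Translate point p along unit direction `axis` by `dist`
    (e.g. a sliding drawer)."""
    return [p[i] + dist * axis[i] for i in range(3)]

# Swinging a laptop-lid Gaussian 90 degrees about the z-axis hinge:
print(revolute_transform([1.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                         [0.0, 0.0, 1.0], math.pi / 2))
```

Interpolating the joint parameter (`angle` or `dist`) between the two observed states is what would let such a model render meshes for unseen intermediate states.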
Video
Qualitative Results of Surface Mesh Reconstruction on PartNet-Mobility Dataset
Qualitative Results of Surface Mesh Generation on PartNet-Mobility Dataset
Qualitative Results on Real-World AKB-48 Dataset
Dynamic Rendering Results on PartNet-Mobility Dataset
USB
Laptop
Scissor
Storage
Fridge
Oven
Stapler
Washer
FoldChair
Blade
Dynamic Rendering Results on Real-World AKB-48 Dataset
Box
Scissor
Stapler