Fu-Jen Chu, Ruinian Xu, Landan Seguin and Patricio A. Vela
Georgia Institute of Technology, GA, U.S.A.
Abstract: This paper presents a framework to detect and rank affordances of novel objects to assist with robotic manipulation tasks. The framework segments the affordance map of unseen objects using region-based affordance segmentation. Detected affordances define an initial state from which to generate action primitives for manipulation via the Planning Domain Definition Language (PDDL). The proposed category-agnostic affordance segmentation approach generalizes learned affordances to unseen objects by applying binary classification to proposed instance masks. The predicted pixel-wise affordances are ranked by KL-divergence, augmenting the available affordance choices for manipulation tasks with non-primary affordances of an object. Experimental results show that the proposed method achieves state-of-the-art performance on affordance segmentation of novel objects and outperforms baselines on affordance ranking. Actual robotic manipulation scenarios demonstrate the use of affordance detection with PDDL-generated action primitives for task execution. Prediction of ranked affordances on unseen objects provides flexibility to accomplish goal-oriented tasks.
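The abstract describes ranking predicted pixel-wise affordances by KL-divergence. As a rough illustration of that idea (not the paper's exact formulation), the sketch below ranks candidate affordance labels by how far each label's normalized score distribution diverges from a uniform distribution, treating peakier predictions as higher-confidence. The `rank_affordances` helper and its dictionary input format are hypothetical, introduced only for this example.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions.

    A small epsilon is added before normalizing to avoid log(0).
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def rank_affordances(scores):
    """Rank affordance labels by KL-divergence from uniform.

    `scores` maps each candidate label to a vector of prediction
    scores (e.g., pooled per-pixel confidences). Distributions far
    from uniform (peaky, confident) rank first. This uniform
    reference is an assumption for illustration, not the paper's
    exact ranking criterion.
    """
    n = len(next(iter(scores.values())))
    uniform = np.full(n, 1.0 / n)
    ranked = sorted(scores.items(),
                    key=lambda kv: kl_divergence(kv[1], uniform),
                    reverse=True)
    return [label for label, _ in ranked]
```

For example, a sharply peaked score vector for "grasp" would outrank a nearly flat one for "cut", so the primary affordance appears first while non-primary affordances remain available further down the ranking.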
RA-L with IROS2019: paper link
Code: github
Manipulation with Affordances and PDDL
Supplementary Video
Affordance Detection and Ranking