Learning to Jump from Pixels
Gabriel Margolis¹, Tao Chen¹, Kartik Paigwar², Xiang Fu¹, Donghyun Kim¹,³, Sangbae Kim¹, Pulkit Agrawal¹
¹Massachusetts Institute of Technology  ²Arizona State University  ³University of Massachusetts Amherst
ABSTRACT: Today's robotic quadruped systems can robustly walk over a diverse range of rough but continuous terrains, where the terrain elevation varies gradually. Locomotion on discontinuous terrains, such as those with gaps or obstacles, presents a complementary set of challenges. In discontinuous settings, the robot must plan ahead using visual inputs and execute agile behaviors beyond robust walking, such as jumps. Such dynamic maneuvers induce significant motion of onboard sensors, which introduces a new set of challenges for real-time visual processing. The requirements of agility and terrain awareness in this setting reinforce the need for robust control. We present Depth-based Impulse Control (DIC), a method for synthesizing highly agile, visually guided locomotion behaviors. DIC affords the flexibility of model-free learning while regularizing behavior through explicit model-based optimization of ground reaction forces. We evaluate performance both in simulation and in the real world.