Abstract

Interactions play a key role in understanding objects and scenes, for both virtual and real-world agents. We introduce a new general representation for proximal interactions among physical objects that is agnostic to the type of objects or interaction involved. The representation is based on tracking particles on one of the participating objects and then observing them with sensors appropriately placed in the interaction volume or on the interaction surfaces. We show how to factorize these interaction descriptors and project them onto a particular participating object, obtaining a new functional descriptor for that object, its interaction landscape, which captures its observed use in a spatio-temporal framework. Interaction landscapes are independent of the particular interaction and capture subtle dynamic effects in how objects move and behave when in functional use. Our method relates objects based on their function, establishes correspondences between shapes based on functional key points and regions, and retrieves peer and partner objects with respect to an interaction.
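To make the representation concrete, below is a minimal sketch of one plausible reading of the sensing step, not the authors' implementation: particles seeded on one participating object are tracked over time, and each sensor placed in the interaction volume records how many tracked particles fall within its radius at every timestep. The resulting time-by-sensor matrix would be the raw interaction descriptor that is later factorized. The names (interaction_descriptor, sensor_radius) and the simple counting scheme are illustrative assumptions.

import numpy as np

def interaction_descriptor(particle_tracks, sensor_positions, sensor_radius):
    # particle_tracks: (T, P, 3) array of positions over T timesteps of P
    #   particles seeded on one participating object (assumed input).
    # sensor_positions: (S, 3) array of sensors placed in the interaction
    #   volume, e.g. on a regular grid or on the partner object's surface.
    # Returns a (T, S) spatio-temporal descriptor: per-sensor particle
    #   counts at each timestep.
    T = particle_tracks.shape[0]
    S = sensor_positions.shape[0]
    desc = np.zeros((T, S))
    for t in range(T):
        # pairwise distances between all particles and all sensors at time t
        d = np.linalg.norm(
            particle_tracks[t][:, None, :] - sensor_positions[None, :, :],
            axis=-1)                                   # shape (P, S)
        desc[t] = (d < sensor_radius).sum(axis=0)      # particles per sensor
    return desc

Factorizing such a matrix (for example with a low-rank decomposition) and projecting the sensor weights back onto the object's surface would yield a per-point functional signature in the spirit of the interaction landscape described above; the specific factorization used in the paper is not reproduced here.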

Downloads

Paper [preprint PDF], BibTeX [txt].

Acknowledgements

We would like to thank Mirela Ben-Chen for her insightful comments and suggestions on methods for 3D vector fields, and Torsten Hädrich for his implementation of the SPH-based fluid solver. This work was supported by NSF grants CCF-1514305, IIS-1528025, EEC-1606396, and NSG-1161480; the Stanford AI Lab-Toyota Center for Artificial Intelligence Research; a Google Focused Research Award; the Max Planck Center for Visual Computing and Communications; the JSPS Strategic Young Researchers Visits Program for Accelerating Brain Circulation; and the National Natural Science Foundation of China (Grant 61373071).