ICIAP 2025 workshop on
HUMAN-OBJECT INTERACTION:
INTEGRATING EGOCENTRIC AND EXOCENTRIC PERSPECTIVES
Recent advancements in artificial intelligence (AI) and computer vision have significantly improved our ability to understand human-object interactions in both egocentric and exocentric perspectives. These developments have opened new frontiers in behavioural analytics, personalised experiences and intelligent retail environments, enabling more immersive and context-aware AI-driven applications. The combination of first-person vision from wearable devices and third-person vision from fixed cameras enables comprehensive analysis of user interactions, supporting use cases in user behaviour modelling and human-centric AI systems.

The large-scale adoption of wearable sensors, AR/VR devices and smart technologies presents new challenges and opportunities for AI. How can egocentric vision contribute to personalised recommendations? How can ambient vision improve behaviour analysis in complex settings? What are the best methods to fuse multimodal data from different perspectives to build adaptive, human-centric AI systems?

The goal of this workshop is to encourage and highlight novel strategies and original research in human-object interaction and behaviour understanding through egocentric and exocentric vision, with applications in retail, assistive technologies, smart environments, and AI-driven behavioural analytics.
The workshop calls for submissions addressing, but not limited to, the following topics:
Long-Term Instance-Based Interaction Understanding
Methods for discovering and tracking object instances over extended periods in streaming scenarios.
Moving beyond short-range, static analyses to holistic, memory-driven AI models.
Applications in cognitive assessment, smart retail, and personalized AI assistants.
Egocentric Vision for User-Centric Interaction Analysis
Algorithms and systems utilizing wearable devices (e.g., smart glasses, AR headsets) to capture first-person perspectives.
Automatic object discovery, interaction tracking, and formation of symbolic, high-level memories.
Applications in personalized shopping experiences, human-aware AI, and immersive retail engagement.
Exocentric Vision and Contextual Human Behavior Analysis
Approaches using fixed cameras and environmental sensors to understand group behaviors and environmental context.
Retail analytics applications such as customer flow tracking, engagement heatmaps, and store layout optimization.
Additional use cases in safety monitoring, workplace behavior analysis, and consumer interaction modeling.
Fusion of Egocentric and Exocentric Data
Integration of first-person and third-person viewpoints for a comprehensive analysis of human-object interactions.
Overcoming challenges such as occlusions, dynamic backgrounds, and real-world variability.
AI-driven techniques to enhance behavioral analytics, retail intelligence, and smart environment adaptation.
Visual Object Tracking Across Wearable and Ambient Domains
Advances in object tracking algorithms for operation across egocentric and exocentric perspectives.
Memory-driven tracking methodologies supporting cognitive training, user behavior modeling, and AI-powered retail recommendations.
Applications in personalized shopping, smart commerce, and interactive AI-driven environments.
Please see the Submissions and Dates page for submission details.
This workshop brings together the communities of the PRIN 2022 EXTRA-EYE and PRIN PNRR 2022 TEAM projects and serves as a forum for discussion and dissemination of the findings of these projects.
This event has been supported by the European Union - Next Generation EU, Mission 4 Component 1, through Project PRIN 2022 EXTRA-EYE, CUP D53D23008900001 (Università di Macerata), E53D23008280006 (Università di Catania), G53D23002920006 (Università di Udine), and Project PRIN 2022 PNRR TEAM, CUP G53D23006680001 (Università di Udine), E53D23016240001 (Università di Catania).
EPFL
Politecnico di Torino
Università Mercatorum
KPMG