University of Hamburg
University of Hamburg
University of Hamburg
University of Canterbury
University of Calgary
Intelligent Virtual Agents (IVAs), which embody an artificial intelligence (AI) in a humanoid representation, have enormous potential in immersive extended reality (XR) environments to enable natural and engaging human-AI interaction. With the rapid advances in the capabilities of large language models (LLMs) and vision language models (VLMs) for simulating human-like responses, interest in anthropomorphic embodied IVAs has grown across XR research and application domains. This tutorial provides a timely opportunity to learn the theories and frameworks needed to start XR research on embodied IVAs. It also offers discussions and networking with attendees who share similar research interests, helping to establish a community for long-term collaboration and exchange.
March 22, 14.00-15.30, Room 1 (325A)
14.00-14.10: Introduction
14.10-14.25: Keynote Talk
"Changes Through Intelligent Virtual Agents?" -- Dr. Susanne Schmidt, University of Canterbury
14.25-14.35: The Latest Trends in Generative AI Technology for XR (Dr. Sebastian Rings)
14.35-15.05: Live Hands-On Tutorial: Building a Conversational Embodied IVA in XR (Dr. Ke Li & Dr. Sebastian Rings)
Step-by-step demonstration of building your own IVA using the IVA SDK
Hands-on experience with integrating AI models, character animation, and interaction design
Prior preparations recommended:
Bring a Windows device with Unity 6.2 installed
Clone the following Unity sample scene via git:
https://git.informatik.uni-hamburg.de/presence/public/iva-sdk-unity-template
Join our IVA Discord channel
The hands-on step-by-step tutorial is also available on YouTube
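For reference, the preparation steps above can be sketched as the following shell commands (the repository URL is taken from this page; the Unity version and Unity Hub workflow are as stated in the prerequisites, and the exact project-opening step may vary with your local setup):

```shell
# Clone the Unity sample scene for the tutorial
git clone https://git.informatik.uni-hamburg.de/presence/public/iva-sdk-unity-template

# Then open the cloned folder as a project in Unity Hub
# (Unity 6.2 installed on a Windows device, per the prerequisites above)
```

After cloning, join the IVA Discord channel so you can ask questions during the hands-on session.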
15.05-15.30: Summary and Panel Discussion – Prof. Dr. Kangsoo Kim
Panelists: Prof. Tham Piumsomboon, Prof. Christos Mousas, Prof. Michael Neff