Intelligent Interaction is a multidisciplinary field in which computer science meets social science to investigate, design, and evaluate novel forms of multimodal human-computer interaction.
Research in Intelligent Interaction concerns the perception-action cycle of understanding human behaviours and generating system responses, supporting an ongoing dialogue with the user. Understanding the user (through automated evaluation of speech, pose, gestures, touch, facial expressions, social behaviours, interactions with other humans, bio-physical signals, and all content humans create) should inform the generation of intuitive and satisfying system responses. By understanding how and why people use interactive media, interactive systems can be made more socially capable, safe, acceptable and fun. Evaluation of the resulting systems generally focuses on how users perceive them and the experience they engender. These issues are investigated through the design, implementation, and analysis of systems across different application areas and in a variety of contexts.
Example application areas include social robots; tangible and tactile interaction; conversations with intelligent (virtual) agents; mobile coaches and multimodal training games; brain-computer interfaces; and more.
The available topics for the 43rd Conference are listed below. Topics that are no longer available will be crossed out. For more details and the most up-to-date list of topics, always check Canvas.
Sensing technology for detecting food intake
Non-stationary preference learning in online human-robot interaction
Mining knowledge graphs from textbooks
Humor in the wild
Spoken (or multimodal) interaction with information systems
Research infrastructure / research assistant
Stakeholder readiness in spoken interaction or research infrastructure
Various robot-related topics
Interactive media in the health domain
A comparative study of touch feedback in coaching scenarios: Haptic vest vs. embodied social robot in squatting exercises
Detox@Home: prompt engineering techniques to design a virtual agent
Visual language models and gaze tracking in VR art exhibitions
Enhancing content retention through timed LLM conversations in VR art exhibitions
Interventions for tackling picky eating behaviours in children
Conceptual modelling for representing user information in health apps
Effective time-based visualisations for unstructured task planning
Prototyping a conversational agent for children’s play
Telepresence robotics
"Desk chair" robot control
AI-driven personalization in educational dialogues
For further information on the content of this track, you may contact the track chair: Mariët Theune.