This workshop will bring together researchers interested in interactive creative AI to develop artificial intelligence systems that collaborate with humans across the creative and performing arts, including music, visual arts, dance, and drama. A central issue we will address is how to design AI systems that can meaningfully participate in real-time creative workflows and thereby enhance human performance. Unlike traditional generative AI, which produces finished creative outputs, interactive AI systems must respond dynamically to human input, adapt to context, and sustain ongoing human-AI collaboration.
The workshop will focus on the unique technical and methodological challenges that arise when AI systems interact with humans during the creative process. Key questions include:
How can AI respond to the improvised and subjective nature of performance, and do so in real-time?
What interaction paradigms best support human-AI interactive collaboration?
How do we evaluate systems where the novelty of output and improvisation may be valued above correctness?
Creative AI for live interactive performance therefore differs from traditional AI applications: these systems must handle the inherently subjective, contextual, and evolving nature of creative work while remaining responsive to human intent. The workshop aims to address this both by sharing ideas, via presentations, keynote speakers, and round-table discussions, and by sharing practice, through live interactive demos and video show-and-tell sessions.
Although this workshop will accept submissions from all areas of interactive creative AI, the primary focus will be on systems that facilitate real-time human-AI live performances. Specifically, we are interested in embodied performance, exploring how AI can be embodied in ways that connect to our own bodies. This might involve a wide range of modalities, including touch, music, and brain-computer interaction, as well as diverse sensors and actuators attached to human or AI bodies (e.g. human-robot interaction) to capture data for, and deliver stimuli generated by, AI. While much recent work has focused on large-scale generative models, there has been less attention to how these models can be effectively integrated into live interactive creative performances, how humans negotiate control with AI, and how such interactive systems should be evaluated.
Another area of emerging interest is the social, legal, and cultural implications of AI. As these systems become more prevalent in the creative industries, questions arise about authorship. The workshop is therefore interested both in technical approaches to building trustworthy creative AI and in frameworks for understanding the broader impact on creative communities and practices.
The 1-day workshop will open with a keynote by Steve Benford on AI for live performances in somabotics and embodied creative AI. This will be followed by selected paper presentations (10-minute talk plus 5 minutes for questions) grouped by interaction modality.
A second keynote, from a performing artist with experience of working live with AI, will provide a perspective on real-world applications and deployment before lunch. The afternoon will open with a poster session, allowing informal discussion of the remaining submissions.
Following the posters, we will hold a structured group discussion addressing future research directions, ethical considerations, and strategies for involving artists in the research process. The workshop will conclude with a session sharing both practice and ideas, including an interactive demonstration of live music performed in partnership with AI systems. Workshop participants are invited to demonstrate their own work, either live or as a video show-and-tell.
Paper deadline: 22nd October 2025 AoE
Notifications Sent to Authors: 5th November 2025
Camera-ready: 13th November 2025
Workshop Date: 26th January 2026