Overview
This half-day workshop aims to facilitate collaboration among researchers from various sub-disciplines of HCI, bridging the gaps between HCI and adjacent fields such as machine learning (ML), computer vision (CV), natural language processing (NLP), and software engineering. We welcome participants working on research and applications of AI in UI/UX design, GUI agents, multimodal interaction, and adaptive interfaces spanning 2D and XR environments, from both industry and academia. The primary goal is to explore how intelligent interfaces can collaborate with users across modalities, rather than replace them.
Topics
We invite researchers and practitioners to participate in the workshop by submitting a contribution in the following format:
A 4–6 page position paper in the double-column CHI Extended Abstract format (excluding references)
Submissions will undergo a peer-review process, with each paper reviewed by at least two committee members or organizers; selection will be based on quality and relevance. Participants should follow the instructions on the website and submit their position papers via email to user.interface.workshop@gmail.com.
Submissions can cover, but are not limited to, the following topics:
GUI Agents as Collaborative Partners Across Interfaces:
How autonomous agents perceive, understand, and manipulate UIs to accomplish tasks on behalf of users. This topic explores agents' capabilities in navigating complex interfaces, from web applications to mobile apps and operating systems. Examples include agents that execute multi-step workflows across applications (e.g., OpenAI Operator, Anthropic Computer Use), agents that guide novice users through complex interfaces, and agents that provide accessible UI overviews for users with disabilities.
Designing Interfaces for Agent Understanding:
What makes a UI "agent-readable"? This topic investigates how UI design across modalities can support agent interpretation and operation. Key questions include how to represent UI semantics, structure, and affordances in ways that both humans and agents can understand. Examples include semantic UI structures that agents can interpret, accessible designs that benefit both humans and agents, and consistent interaction patterns that enable reliable agent operation.
Human-AI-UI Collaboration Patterns:
In the Human ←→ Agent ←→ UI triangle, agents act as mediators between human intent and interface actions. This topic examines interaction patterns where agents interpret user goals and translate them into UI operations, while UIs must be designed to support both human oversight and agent execution. Key questions include: How do users communicate intent to agents? How do agents demonstrate their understanding of UI state? How can UIs support seamless handoff between human and agent control? Examples include mixed-initiative systems where users can intervene in agent tasks, and UI designs that make agent actions transparent, reversible, and correctable through user demonstration or feedback.
Agents in Spatial and XR Environments:
Unique challenges emerge when agents operate in embodied, spatial, and hybrid interface contexts. This topic explores how agents can understand and interact with 3D/AR/VR interfaces, where UI elements have spatial relationships and context-dependent meanings. Examples include agents that navigate spatial UI layouts in XR, context-aware AR interfaces, and agent-driven assistance in mixed reality tasks.
We will recruit researchers and practitioners in this field as program committee members to review submissions. Accepted papers will be made available on the workshop website with the authors' consent. At least one author of each accepted position paper must register for and attend the workshop, either in person or remotely.
Authors of accepted position papers will give a short impulse talk (about 5 minutes) to seed discussions and activities.
Key Dates
Call for participation released: December 15, 2025
Position paper submission deadline: February 10, 2026
Notification of acceptance: March 15, 2026
Workshop date: April 13–17, 2026 (specific date to be announced)