Workshop on Novel Input Devices and Interaction Techniques – NIDIT at IEEE VR 2026, Date: March 22nd, 2026, Daegu, Korea
Virtual, Augmented, and Extended Reality (VR/AR/XR) technologies have rapidly evolved from experimental prototypes into mainstream platforms for research, industry, and entertainment. Recent advances in display technologies have led to high-resolution, ergonomic, and – critically – low-cost head-mounted displays (HMDs). Yet, advances in commercial input devices and interaction techniques have arguably not kept pace with improvements in display systems. For instance, many commercial systems still depend on input methods that have changed little since the early days of VR. Bridging this gap requires rethinking how humans communicate with immersive systems through new input devices and interaction techniques. The NIDIT 2026 Workshop calls on the community to shape this next generation of input technologies and interaction methods.
Topics
This half-day workshop unites researchers and industry experts to define the next generation of input and interaction technologies for Virtual/Augmented/Mixed/Extended Reality and 3D User Interfaces. We invite authors to submit 4–6-page papers on any of the following topics:
§ Form factors and ergonomics of novel input devices,
§ Hardware design, rapid prototyping, and fabrication for XR,
§ Mapping of input to varying degrees of freedom and body tracking,
§ AI-driven and adaptive input recognition (LLMs, multimodal fusion),
§ Eye-gaze, EMG, EEG, and bio-sensing-based interaction,
§ Repurposing existing devices (e.g., smartphones, wearables, tablets) for immersive contexts,
§ Sustainable and accessible input design approaches,
§ User studies evaluating usability, learning, and cognition in novel interaction techniques,
§ Integration of multimodal feedback and co-adaptive systems,
§ Interaction frameworks supporting embodied cognition and spatial reasoning.
Related but unlisted topics are also welcome. In addition to a presentation at the workshop, authors of all accepted submissions are strongly encouraged to demonstrate their novel input devices and interaction techniques in an interactive demo format following their presentation.
Submission Information
NIDIT 2026 will accept short papers of 4–6 pages (including references).
Papers should be submitted via PCS: https://new.precisionconference.com/
Submissions must be anonymized, in PDF format, and use the VGTC template: https://tc.computer.org/vgtc/publications/conference/
All submissions will be reviewed by experts in the areas listed above. At least one author of each accepted submission must register for the workshop and attend at least one day of the IEEE VR 2026 conference.
IMPORTANT NOTE: Authors of accepted papers are expected to give a 10-minute presentation at the workshop and are strongly encouraged to subsequently give a hands-on demonstration of their research.
In cases where demonstrations are not possible, a video may be provided. The workshop organizers will be able to provide limited quantities of standard equipment (e.g., head-mounted displays, controllers) to help authors demonstrate their work. Authors of accepted submissions should contact the organizers early to determine the feasibility of a demonstration. Proceedings will be submitted for inclusion in the IEEE Xplore Digital Library. We will also host the papers on the NIDIT website.
Important Dates
▪ IEEE VR Conference Papers Author Initial Notification: December 12, 2025, AoE
▪ NIDIT Submission Deadline [UPDATED]: January 17, 2026, AoE (previously January 15, 2026, AoE)
▪ NIDIT Notification Deadline: January 19, 2026, AoE
▪ NIDIT Camera-Ready: January 24, 2026, AoE
Organizers:
Prashant Rawat, prashant.rawat@dal.ca
Mohammad Raihanul Bashar, mohammadraihanul.bashar@mail.concordia.ca
Kristen Grinyer, kristengrinyer@cmail.carleton.ca
Mayra D. Barrera Machuca, mbarrera@ucalgary.ca
Anil Ufuk Batmaz, ufuk.batmaz@concordia.ca
Francisco R. Ortega, fortega@colostate.edu
Wolfgang Stuerzlinger, w.s@sfu.ca
Contact:
Please send any questions to Prashant Rawat (prashant.rawat@dal.ca).
Title: The New Reality of Extended Reality: Empirical Evaluation of Interaction in XR
Abstract: Extended reality (XR), a catch-all term for virtual reality (VR), mixed reality (MR), and augmented reality (AR), is popular again with the release of low-cost, effective consumer-grade head-mounted displays such as the Meta Quest. The longstanding dream of VR is that users interact with virtual objects as naturally as with real ones. In practice, despite technological advances, numerous technical and human factors make this difficult. Modern VR interaction continues to employ naturally inspired interaction techniques that have changed little since their introduction in the late 1980s. Similarly, cybersickness and the lack of tactile feedback when interacting with virtual objects are well known to limit the effectiveness of VR systems, yet these issues persist today. In this talk, I will discuss my research addressing these three interrelated areas of virtual reality interaction. I will first describe my studies comparing 3D selection interfaces between 3D and desktop systems, and my work extending a standardized methodology to support fair and direct comparison between these two modalities. I will then discuss my research group's recent work employing this standardized methodology to evaluate novel 3D selection methods, as well as other projects aimed at enhancing the usability of VR systems by evaluating the effectiveness of cybersickness reduction techniques and novel approaches to VR haptics that employ shape-changing devices and perceptual illusions. I will close by discussing future directions for this work on improving both the usability of, and equitable access to, VR technology.
Biography: Robert J. Teather is a Senior Lecturer in Human-Centred Computing at Monash University, where he studies several interrelated areas, including interaction techniques and input devices, especially as applied to 3D user interfaces for virtual reality. He holds a PhD and an MSc in Computer Science (York University, Canada), as well as a BSc in Computer Science (Brock University, Canada). Prior to joining Monash, he was an Associate Professor in the School of Information Technology at Carleton University, Canada, where he served as School Director from 2022 to 2025. His PhD work focused on developing standardized methods for the empirical comparison of input devices for 3D interaction, primarily to compare mouse- and 3D-tracker-based input. Through this work, Dr. Teather has established himself as an expert in comparing drastically different input devices and interaction techniques for common fundamental interaction tasks in VR (e.g., target selection), across varying system configurations (e.g., display properties such as stereo graphics, or system properties such as latency). His research is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canada Foundation for Innovation. He has also served in lead conference organization roles (e.g., general chair, technical program chair) for events including the IEEE Conference on Virtual Reality and 3D User Interfaces, the ACM Symposium on Virtual Reality Software and Technology, and the ACM Symposium on Spatial User Interaction.
#7140 Use the force - XR text input via sensitive force modulation
Authors: Fabian Rücker, Robin Horst, Torben Storch, Arjan Kuijper, Martin Weier
#4209 Cezve: A Virtual Turkish Coffee Pot Interaction Technique for 3D Target Selection in Virtual Reality
Authors: Rumeysa Turkmen, Anil Ufuk Batmaz
#4504 A User-Friendly Accurate Bare-Handed Manipulation Technique for Distant Objects in Virtual Reality
Authors: Di Zhao, Wolfgang Stuerzlinger
#8173 StrainGain: Neck Strain Amplifies Head Pointing in VR
Authors: Guanlin Li, Florian Weidner, Anam Ahmad Khan, Haopeng Wang, Jinghui Hu, Hans Gellersen
#7945 Vibrotactile Mass Cues for Planetary Interaction in VR
Authors: Md Asif Bin Karim, Mohammad Jahed Murad Sunny, Aryabrata Basu
#9957 Investigating Adaptive Hand Visibilities for Accurate 3D User Interactions in Augmented Reality
Authors: Rumeysa Turkmen, Nour Hatira, Robert J Teather, Marta Kersten-Oertel, Wolfgang Stuerzlinger, Anil Ufuk Batmaz
#3498 Contextual Recovery: Guiding Hand Tracking Failures Recovery in Mixed Reality via VLM Reasoning
Authors: Yi Zou, Ziming Li, Hai-Ning Liang, Zhiming Hu
#9091 I like to Move It, Move It - Non-Intrusive Full Body Tracking for CAVEs
Authors: Elisabeth Mayer, Thomas Odaker, Dieter August Kranzlmüller
#3427 Hands vs. Controllers With and Without Haptics in VR Object Sorting: Performance, Cognitive Load, and Sense of Agency
Authors: Rudra Krishna, Nihar Sabnis, Chiranjoy Chattopadhyay, Rahul Kumar Ray
#3529 Exploring and Evaluating Multimodal Interaction Techniques for Reducing Fatigue in Long-Duration VR
Authors: Upulanka Premasiri, Stephan Lukosch, Robert W. Lindeman