Workshop on Novel Input Devices and Interaction Techniques – NIDIT at IEEE VR 2025, Date: TBD, Saint-Malo, France
Virtual Reality (VR) has become a mainstream technology. Recent advances in VR display technologies have led to high-resolution, ergonomic, and – critically – low-cost head-mounted displays (HMDs). Commercial input devices and interaction techniques, however, have arguably not kept pace with these display advances. For instance, most HMDs include tracked input devices, but these “controllers” are still fairly similar to the earliest 3D controllers used in the VR systems of the 1980s. Interaction in commercial VR systems has similarly lagged: despite three decades of research advances in 3D interaction, commercial systems still largely rely on classical techniques such as the virtual hand or ray-casting.
Topics
This half-day workshop will bring together researchers and industry practitioners to discuss and experience the future of input devices for Virtual/Augmented/Mixed/Extended Reality and 3D User Interfaces, and help chart a course for the future of 3D interaction techniques. We invite authors to submit 4-6 page papers on any of the following topics:
§ Form factors and ergonomics of input devices,
§ Hardware design and prototyping,
§ Mapping of input to varying degrees of freedom,
§ Haptic/tactile feedback,
§ Novel input devices,
§ Repurposing of existing devices (e.g., smartphones, tablets) for immersive contexts,
§ Tracked passive or custom props,
§ Novel interaction techniques supported by custom devices,
§ User studies evaluating the above topics.
Related but unlisted topics are also welcome. In addition to a presentation at the workshop, authors of all accepted submissions are strongly encouraged to demonstrate their novel input devices and interaction techniques in an interactive demo format following their presentation.
Submission Information
NIDIT 2025 will accept short papers of 4-6 pages (including references).
Papers should be submitted via PCS: https://new.precisionconference.com/
Submissions must be anonymized and in PDF, using the VGTC format: https://tc.computer.org/vgtc/publications/conference/
All submissions will be reviewed by experts in the areas listed above. At least one author of each accepted submission must register for the workshop and at least one day of the IEEE VR 2025 conference.
IMPORTANT NOTE: Authors of accepted papers are expected to give a 10-minute presentation at the workshop and are strongly encouraged to subsequently give a hands-on demonstration of their research.
In cases where demonstrations are not possible, a video may be provided. The workshop organizers will be able to provide limited quantities of standard equipment (e.g., head-mounted displays, controllers) to help authors demonstrate their work. Authors of accepted submissions should contact the organizers early to determine what is available. Proceedings will be submitted for inclusion in the IEEE Xplore Library. We will also host the papers on the NIDIT website.
Important Dates
▪ IEEE VR Conference Papers Author Notification: December 10, 2024, AoE
▪ NIDIT Submission Deadline: January 7, 2025, AoE
▪ NIDIT Notification Deadline [UPDATED]: January 13, 2025, AoE
▪ NIDIT Camera-Ready Deadline [UPDATED]: January 17, 2025, AoE
Organizers:
Mayra D. Barrera Machuca, mbarrera@dal.ca
Prashant Rawat, prashant.rawat@dal.ca
Kristen Grinyer, kristengrinyer@cmail.carleton.ca
Amy Banic, abanic@uwyo.edu
Anil Ufuk Batmaz, ufuk.batmaz@concordia.ca
Francisco R. Ortega, fortega@colostate.edu
Robert J. Teather, rob.teather@carleton.ca
Wolfgang Stuerzlinger, w.s@sfu.ca
Contact:
Please send any questions to Mayra Donaji Barrera Machuca (mbarrera@dal.ca).
Keynote
Title: Eye-Hand Symbiosis: Unified Eye and Hand-based Interfaces for HCI
Abstract: Although computer control has been traditionally designed for our hands, recent developments in XR show promise in new interaction paradigms. In this talk, I present my vision of 'Eye-Hand Symbiosis'—a future where traditional hand-based computer interfaces are enhanced by gaze input. For over a decade, my collaborators and I have explored eye-hand interaction across diverse systems, from touchscreens and pens to 3D UIs. I will showcase key examples, discuss broader considerations for input theory, taxonomies, and design principles, and conclude with reflections on future directions for this paradigm.
Bio: Ken Pfeuffer is an Associate Professor at Aarhus University specializing in Human-Computer Interaction (HCI), with a focus on virtual (VR) and augmented reality (AR) as well as eye-tracking technologies. His research explores new ways of interacting with digital information, particularly in the context of 3D and multimodal user interfaces.
He completed his PhD at Lancaster University and a postdoctoral fellowship at Bundeswehr University, with research internships at Microsoft and Google. His work has received honorable mention awards at conferences such as UIST, SUI, and CHI, recognizing its contribution to the field. His research explores interaction techniques like "Gaze + Pinch," similar to those now appearing in AR products by companies like Meta, Apple, and Google.
Accepted Papers
#1001 MUHI - A Multi-Use Haptic Interface that enables Haptic Interactive Distractors for Redirected Walking
Fabian Rücker, Torben Storch, Eike Langbehn
#1002 Does Perspective Matter? Understanding the Role of Viewpoints on User Performance in 3D Sketching
Jialin Zhang, Mayra Donaji Barrera Machuca
#1003 Interactive Machine Learning for Movement Interaction in VR
Tom Lawrence, Tianyuan Zhang, Clarice Hilton, Marco Fyfe Pietro Gillies
#1004 A Novel Bare-Handed Manipulation Technique for Distant Objects in Virtual Reality
Di (Bill) Zhao, Wolfgang Stuerzlinger
#1005 Logitech MX Ink: Extending the Meta Quest Platform with a 6DoF Stylus
Aidan Kehoe, Mario Gutierrez, Yena Ahn, Vadim Kogan
#1006 An Approach for Effective CPR Trainings in Virtual Reality with Multimodal Feedback
Suyash Aditya Lal, Chiranjoy Chattopadhyay, Rahul Kumar Ray
#1007 A Haptic Device for Tennis Simulation: Dual-Flywheel System for Rendering Virtual Impact
Allyson E Chen, Xuan Gedney, Jasmine Roberts