Wheeler: Multi-Wheel Input for Non-Visual Navigation
(UIST'24)
Problem:
Blind users often face significant challenges navigating modern graphical user interfaces (GUIs), especially when elements are deeply nested, unlabeled, or visually grouped without an accessible structure. Traditional screen readers and keyboard shortcuts offer only slow, linear navigation, making it difficult to interact efficiently with professional tools such as Excel or to perform remote tech support. These limitations increase reliance on sighted assistance and restrict digital autonomy.
Solution:
We designed and built Wheeler (see Fig. 1), a mouse-shaped device with three independently rotatable wheels, each controlling a distinct navigation axis or hierarchy level. Wheeler operates in two primary modes (a code sketch of both mappings follows the list below):
H-Nav Mode: Hierarchical navigation (e.g., window → panel → element), where the three wheels are assigned to the first three levels of the app hierarchy (see Fig. 2).
2D-Nav Mode: Two-dimensional cursor movement across the interface, with the first two wheels controlling movement along the horizontal and vertical axes, respectively, and the third wheel controlling cursor speed (see Fig. 5).
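To make the wheel-to-action mapping concrete, here is a minimal sketch of how three signed wheel deltas could drive the two modes. It is an illustration under assumptions, not the paper's implementation: the Node and WheelerSketch names, the toy three-level tree, and the step sizes are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """One element of a toy three-level app hierarchy (window -> panel -> element)."""
    name: str
    children: List["Node"] = field(default_factory=list)

class WheelerSketch:
    """Hypothetical mapping from signed wheel deltas to navigation actions."""

    def __init__(self, root: Node):
        self.root = root               # desktop; its children are windows
        self.path = [0, 0, 0]          # selected index at each hierarchy level
        self.mode = "H-Nav"            # or "2D-Nav"
        self.cursor = [0, 0]           # pointer position used in 2D-Nav
        self.speed = 5                 # pixels per encoder detent in 2D-Nav

    def _siblings(self, level: int) -> List[Node]:
        """Children of the currently selected ancestor at the given level."""
        node = self.root
        for lvl in range(level):
            kids = node.children or [Node("(empty)")]
            node = kids[self.path[lvl] % len(kids)]
        return node.children or [Node("(empty)")]

    def on_wheel(self, wheel: int, delta: int) -> str:
        """wheel: 0..2; delta: signed detent count reported by that wheel's encoder."""
        if self.mode == "H-Nav":
            # Each wheel scrubs through siblings at its own hierarchy level;
            # changing an ancestor resets the deeper selections.
            siblings = self._siblings(wheel)
            self.path[wheel] = (self.path[wheel] + delta) % len(siblings)
            for lvl in range(wheel + 1, 3):
                self.path[lvl] = 0
            return f"focus -> {siblings[self.path[wheel]].name}"
        # 2D-Nav: wheels 0/1 move the pointer, wheel 2 adjusts its speed.
        if wheel == 2:
            self.speed = max(1, self.speed + delta)
            return f"speed -> {self.speed}"
        self.cursor[wheel] += delta * self.speed
        return f"cursor -> {tuple(self.cursor)}"

# Example: rotate the second wheel two detents forward while in H-Nav mode.
demo = WheelerSketch(Node("desktop", [Node("Excel", [Node("Ribbon"), Node("Sheet")])]))
print(demo.on_wheel(1, 2))   # scrubs among the panels of the selected window
```

A real implementation would attach this mapping to the accessibility tree exposed by the platform rather than a toy data structure, but the split between per-level scrubbing (H-Nav) and velocity-scaled pointer movement (2D-Nav) is the core idea.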
The device senses wheel rotation with rotary encoders (a decoding sketch follows the study list below) and provides haptic and audio feedback; it was iteratively co-designed with blind users through participatory sessions. We conducted:
Two controlled lab studies with 12 blind participants.
A 6-month diary study with a blind co-author using Wheeler in real-world scenarios.
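The summary notes that the device uses rotary encoders, but how their raw signals are decoded is not described here, so the following is only an assumption about that layer. It shows standard quadrature decoding, which turns an encoder's two channel levels (A, B) into the signed step counts consumed by a mapping like the one sketched above.

```python
# Standard quadrature decoding: map (previous A/B state, current A/B state)
# transitions to a signed step. Invalid or repeated states contribute 0.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class QuadratureDecoder:
    """Accumulates a signed position for one wheel from raw A/B samples."""

    def __init__(self):
        self.prev = 0b00
        self.position = 0

    def sample(self, a: int, b: int) -> int:
        """Feed one (A, B) sample (0 or 1 each); returns the step it produced."""
        state = (a << 1) | b
        step = TRANSITIONS.get((self.prev, state), 0)
        self.prev = state
        self.position += step
        return step

# Example: one clockwise pass through the Gray-code cycle yields four +1 steps.
dec = QuadratureDecoder()
for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:
    dec.sample(a, b)
print(dec.position)   # prints 4: four positive quadrature steps (one full cycle)
```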
Outcome:
In lab evaluations, Wheeler reduced navigation time by up to 40% in hierarchical tasks compared to screen reader + keyboard use.
Users successfully accessed unlabeled or hidden UI elements (e.g., sliders, tooltips, aria-hidden items).
The diary study demonstrated real-world utility in productivity (Microsoft Office), web browsing, and remote collaboration scenarios.
Users reported improved agency, lower frustration, and found Wheeler quick to learn.
Impact:
Wheeler demonstrates a novel input paradigm that enhances non-visual interaction through parallel, rotary navigation. Its contributions include:
Enabling faster, more structured access to complex app hierarchies.
Supporting interaction with otherwise inaccessible UI components.
Offering a low-cost, augmentative tool that integrates well with screen readers.
Receiving a Best Paper Honorable Mention at UIST 2024, validating its research and design contributions to accessibility, HCI, and assistive technologies.