When navigating touchscreen interfaces, with or without a keyboard attachment, Blind and Low Vision (BLV) individuals face challenges in interfaces with large regions of negative space, which sighted individuals perceive through graphically displayed elements. To access content, BLV users can linearly swipe through all of the content with a screen reader (the touchscreen equivalent of tabbing when no keyboard attachment is provided), press arbitrary sections of the interface in the hope of landing on the specific piece of information they are looking for, or recall the specific gesture or shortcut that jumps to a given function. Additionally, compared to a desktop interface with keyboard control, BLV users find it harder to develop a cohesive interpretation of an interface’s spatial-topological relations, which is required to navigate it effectively.
To address these challenges, we sought to “translate” the types of spatial-topological properties found in a Graphical User Interface (GUI) to a medium perceivable by BLV individuals through touch. While future touch-based access to spatial-topological properties could be afforded in several ways, such as a refreshable haptic display (Jang et al., 2016), we developed a paper-based tactile sheet representing the spatial relations among elements to test and demonstrate what should be afforded by this recommended paradigm. This sheet can be placed on a touchscreen and the user may interact with content through cut-outs on the sheet. BLV individuals would then feel the interface with their hands and perceive where elements are in reference to each other, aiding their overall understanding of an interface’s layout.
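The interaction model described above can be illustrated with a minimal sketch: an application that knows the positions of the sheet’s cut-outs can map a touch point to the element beneath it and announce it. All names, coordinates, and labels below are hypothetical for illustration; a real sheet would be measured to match the on-screen layout.

```python
# Minimal sketch of mapping touch points to tactile-sheet cutouts.
# The cutout rectangles and labels are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Cutout:
    label: str      # content announced when the cutout is touched
    x: float        # left edge, in screen points
    y: float        # top edge, in screen points
    width: float
    height: float

    def contains(self, tx: float, ty: float) -> bool:
        # True if the touch point falls inside this cutout's rectangle
        return (self.x <= tx <= self.x + self.width
                and self.y <= ty <= self.y + self.height)

# Hypothetical layout for a meeting interface
CUTOUTS = [
    Cutout("Mute button", 20, 600, 80, 60),
    Cutout("Participant list", 120, 100, 200, 400),
]

def element_at(tx, ty):
    """Return the label of the cutout under a touch, or None."""
    for c in CUTOUTS:
        if c.contains(tx, ty):
            return c.label
    return None
```

Because the sheet physically constrains where fingers can land, the lookup only needs to resolve touches inside cut-outs; everything else is the tactile negative space the user feels directly.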
This tactile sheet translates the affordances of spatial relations between elements on an interface found in GUIs into a tangible representation. This translation makes these relations perceivable by BLV individuals, which in turn allows users to more easily forage for information (Pirolli and Card, 1999). We call this emerging idea a haptic-spatial foraging paradigm, as it uses haptics to provide spatial information that makes foraging for content easier.
Zoom Platform on an iPad, with a Tactile Sheet
To be onboarded into a new interface, or more generally to build a cohesive mental model of it, an overview is necessary. A GUI affords sighted individuals the ability to skim content and form an overall impression of how UI elements are laid out in relation to one another. The linear delivery of content through a screen reader provides no such affordance.
Through the tactile sheet, BLV users could quickly form an overall mental model of the interface’s arrangement. Especially when using two hands, different content can be compared in scale and relative position.
Provide a way to skim content in an interface. This may be done through a tactile representation in which participants can quickly feel the layout of the interface.
Avoid attempting to provide an overview purely with alternative text. This often results in overly verbose descriptions that remain difficult to compile.
Provide multiple mediums and formats to access the same content (such as audio, video, etc.).
This is necessary because even when an accessible format is provided, such as content delivered through a screen reader, it may remain insufficient. As noted in the barriers and pain points section, participants reported that swiping/tabbing through the interface is unsuitable for online meetings, as it interferes with the meeting’s spoken auditory content. The information must therefore also be provided through other means.
A similar issue occurs in a tactile sheet that communicates important information through Braille. The Braille format may help certain participants avoid overusing auditory feedback, yet at the same time Braille literacy is very low.
This can be accomplished via cross-sensory representations such as refreshable haptic displays, tactile sheets, spatial audio, alt text, etc. In this context of translating information into a different sensory modality, redundant information is a positive rather than a negative (Doore et al., 2023).
Do not rely on only one accessible format. For example, Braille on its own is not sufficient for conveying information accessibly. Likewise, do not provide cues in only one sensory modality, for example only through audio.
If an interface contains two separate pieces of information, differentiate them explicitly. Visually, this is often done by varying colours, fonts, shapes, sizes, or positions, but such differentiation is less often applied to auditory or tactile interfaces.
In one instance of the tactile map interfaces, two objects touched each other in the space. Since they visually had no gap between them, we cut them so they shared the same opening when making the tactile translation of the map. Participants often did not expect two different types of content there: with no separation between the two pieces on the overlay, they haptically seemed to be the same and were not distinguishable. In later versions we added a small separation between the two pieces, which aided the distinguishability of the content.
Elements should have their own perceptual characteristics so they can be more easily differentiated from other elements. In a tactile representation, this can be done by providing distinct shapes, increasing the space between figures, and similar tactics.
For tactile sheets, avoid making the openings so small that a finger can no longer press the content. Add tactile space between cutouts to distinguish different pieces of information.
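These two constraints can be sketched as an automated layout check. The sketch below assumes the 44-point minimum touch target from Apple’s Human Interface Guidelines for cutout size; the 4-point minimum tactile gap is an assumed value for illustration, not a figure from the original study.

```python
# Layout check for a tactile-sheet design: warns when a cutout is too
# small to press, or when two cutouts sit close enough to feel like one.

MIN_TARGET = 44.0  # minimum cutout width/height in points (Apple HIG)
MIN_GAP = 4.0      # minimum separation between cutouts (assumed value)

def check_layout(cutouts):
    """Each cutout is (label, x, y, width, height).
    Returns a list of human-readable warnings."""
    warnings = []
    for label, x, y, w, h in cutouts:
        if w < MIN_TARGET or h < MIN_TARGET:
            warnings.append(f"'{label}' is too small to press reliably")
    for i, (la, xa, ya, wa, ha) in enumerate(cutouts):
        for lb, xb, yb, wb, hb in cutouts[i + 1:]:
            # gap between bounding boxes on each axis (negative = overlap)
            gap_x = max(xb - (xa + wa), xa - (xb + wb))
            gap_y = max(yb - (ya + ha), ya - (yb + hb))
            if max(gap_x, gap_y) < MIN_GAP:
                warnings.append(f"'{la}' and '{lb}' may feel like one element")
    return warnings
```

A check like this would have flagged the touching map objects described above before the sheet was cut, since their bounding boxes share an edge and the gap on both axes falls below the threshold.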
There is often an over-reliance on linguistic auditory feedback for Blind and Low Vision (BLV) users of digital products and services. We found that providing tactile feedback may alleviate this, as well as the over-verbosity of the screen reader when describing the spatial-topological relations between elements.
Additionally, it reduces the cognitive load of remembering keyboard shortcuts or specific gestures, as users can access the required information through the spatial relations between objects rather than through direct access via shortcuts.
Overall we found three uses for this paradigm and developed features accordingly:
As a tool for overviewing layouts of an interface - When encountering an interface, sighted individuals can quickly overview it visually to determine its structure and layout. A tactile representation through a sheet was used so that participants could overview an interface intuitively. This is particularly helpful when becoming acquainted with a new interface, as a mental image of its properties is more easily achieved.
Navigation of virtual spaces in regular use - The sheet can be placed on top of the touchscreen of an existing interface. Through the sheet’s cutouts, users directly perceive a representation of the interface and its content, without having to navigate by swiping/tabbing or by tapping around on the screen. They also no longer have to keep in mind where everything is in relation to everything else, but can instead perceive it in real time. Additionally, users can employ both hands to feel where everything is more quickly, allowing for faster absorption of the layout (McGookin et al., 2010).