Checkpoint

Enhanced Multi-Sensory Maps Case Study

Credo

“Our first responsibility is to the needs of the people we represent. We aim to serve the broadest range of human diversity. We don’t see problems, we see opportunities to innovate. We believe that before we design anything, the voices and needs of the underserved and marginalized must be central to our design ethos.

Our objective is to be proud of every project that we undertake, and for every project to be exemplary in our portfolios.”


A headshot of Caleb Jones

Caleb Jones

A headshot of Mike Rugo

Mike Rugo

A headshot of Elyse Turton

Elyse Turton

A headshot of Katlin Walsh

Katlin Walsh

Image description: Woman hiking with her child in the forest. She's interacting with the enhanced map kiosk to plan her route and learn about the trail.

Why?

People have more navigation tools at their fingertips than ever before, yet traveling from point A to point B can still be a difficult process, especially for people with accessibility needs. Public spaces such as nature trails, parks, and beaches use permanent 2D map directories that are inaccessible.

Traditional 2D maps can be incomprehensible to people with permanent or temporary impairments, including learning impairments, vision impairments and language barriers.

How?

Checkpoint was developed through iterative research, design, and user testing, as well as collaboration with peers and wayfinding experts.

The 7 Universal Design Principles are foundational to the project, but it focuses on “Flexibility in Use” to serve the broadest range of human diversity, regardless of age or ability. Checkpoint preserves the dignity of people with accessibility needs by offering users a choice of methods of use.


Image description: Elderly person using the app on their smartphone to navigate a forest trail. Their phone is vibrating and playing sounds to provide navigation feedback, as well as displaying an enlarged map.
Image description: A woman walking through the woods, guided through the trails by a smartwatch app and its vibrations.

What?

Checkpoint is a multi-sensory map kiosk enhancement and checkpoint navigation app. It assists people, especially those who are visually impaired, by enhancing map directories and wayfinding signs with sound, haptic feedback, textured surfaces, and tactile pavement.

The accompanying Checkpoint app guides new users through a series of wayfinding checkpoints.

Project Clients


The Ontario Government Logo

Government of Ontario

Public spaces run by the Government of Ontario.


The Design Exchange Logo

Design Exchange

Presentation & proposal suited to competition.

The Interaction Design Logo

Myles Bartlett & Bad Access

Remain within the parameters and boundaries of the course material.

Our design process began with defining and clarifying the project's scope. For this project, we had special parameters set by three different clients.

The aim of the project is to enhance or modify visual monolithic map and wayfinding structures with tactile, auditory, and/or other sensory interactions. The objective is to accommodate the greatest possible number of mismatched interactions. One of the main points is to address the lack of accommodation for people who are visually impaired, as they are unable to use maps and wayfinding that are purely visual.

Problem Space

An image of an outdoor map for a mall, the sign reads "Car Park A". The map is 6 feet tall for prominent viewing.
An image of an indoor mall map, the heading reads "Falkirk Town Centre". The map is 6 feet tall for prominent viewing.
An image of a mall map indoors. The map is at waist height on a 45 degree angle for ease of viewing.
An image of three phones showing a wayfinding app proposal. The middle image shows a large view of a map, while the outer two images display directions in a list format.

The structures simply aren't accessible to people with visual impairments. How are they supposed to navigate and use these maps?

Key Decision 🗝

There are lots of challenges with wayfinding, but from the outset I proposed these map kiosk-things. I don't even know how to use them half of the time, so I can't imagine how someone could use one if they are visually impaired.

Design Process

An image of the Google logo exported through a blue colour blindness filter.

Tritanopia Example: Blue Blindness

Visualization & Prototyping

The designers were asked to research as much as possible about public wayfinding, in addition to mobility and visual impairments that may impact wayfinding for a user. This information was then used to create basic personas for the next stage of creation.

Roadblock ¯\_(ツ)_/¯

We really struggled to connect with our designers here. It was like playing a mental game with ourselves, having to give up control of our project.

Iterative Refinement

A sketch of a smart watch, illustrating accessibility needs.

Having the capability of directing users with existing technology was crucial to the brainstorming process.

A sketch of a screen, an augmented interface is displayed on a wireframe.

The ability to use an augmented interface to personalize the user experience was important to the designers.

A rough sketch of a map; an arrow is drawn from point A to point B to demonstrate wayfinding.

The directors emphasized the ability to show a basic map as a minimum viable concept for the proposal.

A complicated sketch of a map, demonstrating multiple checkpoints within a desired route.

Showing a more detailed breakdown of directions while keeping accessibility in mind was also brought up.

At the beginning of our creative process we conceptualized the direction we wanted to go in. Within the boundaries and parameters of the Design Exchange, we came up with several concepts, which included wayfinding, accessibility, and minimal user engagement.

In order to gather a wider range of concepts, we asked our design team to create eight crazy ideas each. With a total of 32 individual ideas, we could begin converging and start the first iteration.

Key Decision 🗝

At this point in the process we had to sit down with our designers and ask them to iterate on what they had submitted to us. We ended up requesting that they combine the above ideas to create a 'super submission' that included all four elements.

An image of a railing complete with braille on the handrail, the braille describes the view of the city escarpment.

Inspiration

We drew inspiration from an art piece by artist Paolo Puddu called “Follow the Shape.” The art piece incorporates braille text on a hand railing at a popular sightseeing location, enhancing the experience of "sight-seeing" for people with visual impairments.

Puddu decided not to reveal what the braille text says: just as blind people cannot see the view before them, sighted people cannot decipher the meaning of those graphemes.

A Breakthrough 💥

During this phase we realized we couldn't reinvent the wheel. There's a reason that people wayfind the same way, so instead of changing it why not enhance it? Real breakthrough moment, especially when we learned about Puddu's "Follow the Shape."

An image of a sketched kiosk with interactive braille included on the screen.

Augmented Kiosk

An augmented interface would be utilized in combination with a web app that would allow users to use the interface even after moving away from the kiosk.

A green image swatch

Colour Coded Areas

A colour coded system would carry through from kiosks to app interface, creating continuity of design elements.

A sketch of a person standing next to a person in a wheelchair, on the right is a kiosk with a pivot point for the user to alter the angle of the screen.

Adjustable height

Adjustable height and tilt settings would be implemented to accommodate a variety of users and to reduce glare during outdoor use.

An image of a phone, on the screen is a map with map wayfinding icons pointing to locations

Geolocation

Geolocation tags combined with augmented reality would be applied in the web app to serve as wayfinding interactions.

Key Decision 🗝

Our designer group came up with a lot of great crazy 8 sketches and we decided to move forward with a combination of a select few sketches that we liked.

Roadblock ¯\_(ツ)_/¯

Looking back, I don't even know how we got to AR and AI. Very cool Star Trek-like features, but we got stuck here. The project was turning into speculative sci-fi and we didn't know how to pivot.

A mockup of a tablet on a table; the image demonstrates adjustable height and tilt, as well as a speaker for the visually impaired.

Iteration

The first iteration used a few components of previous brainstorm and crazy 8 solutions. The main highlight we focused on was touch compatibility for those with accessibility needs. We conceptualized ideas around how to make navigation as fluid as possible, working out ways to handle user engagement and encouraging seamless interactions for participants both with and without accessibility needs.

As a result, our prototype demonstrated a key understanding of our strongest considerations, but we still felt something was missing: an understanding of our vision. To find this, we looked to visualization and prototyping.

Roadblock ¯\_(ツ)_/¯

Throughout our iterative refinements, the overall vision of our product became more blurred as new variables came into play. I think the greatest obstacle was adapting to and applying new information to our project. It wasn't until our project testing that we had a refined vision for the project's outcome.

Project Testing & Refinement

The proposed concept is a prototype of an interactive kiosk with a corresponding web app using AR and AI for wayfinding and navigation in spaces like malls, campuses, and airports. This solution seeks to cover a wide range of people with different accessibility needs, including mobility impairments, visual impairments, and auditory impairments.

The proposed concept has several misalignments with the key objectives outlined by the director group, which pose a very high risk to the viability of the concept.

The concept was designed for use in non-public spaces (malls, school campuses, airports, etc.) instead of public spaces (beaches, trails, parks, etc.). Monolithic structures in public spaces come with more constraints than those in non-public spaces, as they are often exposed to outdoor environments and are not digitized.

Key Issues

The previously proposed design had three core issues that were identified by designers in the testing phase. Based on their critical findings, the group redesigned the user journey, personas, and key aspects of the project.

Indoor Augmented Reality

Even when applied to indoor spaces, the concept remains misaligned, delivering its value to the wrong target users.

AR is not adopted as a standardized feature on smartphones, so using it requires more setup, making it naturally more appealing to tech enthusiasts than to the general public.

Internet Connection

People with accessibility needs involving mobility, visual, and/or auditory impairments are already likely to have limited capacity for smartphone use.

Using a wayfinding AR web app that requires an internet connection, a healthy battery, a setup process, and a sync with the kiosk data may create even more obstacles for the user without actually assisting with wayfinding.

Extra Steps

An AI-driven kiosk is ineffective when it asks people with accessibility needs to skip through steps manually instead of detecting the user's accessibility needs automatically.

For example: wheelchairs, casts, walking canes, or service dogs.

Findings

Because the concept proposes a novel solution only for people with accessibility needs, it leaves out the majority of people who are not affected by them. Even then, the solution misdirects its value by requiring the targeted users to adopt technology that is not standardized, on top of cumbersome setup processes.

The design team brought up suggestions that helped elevate the process: pivot to a different idea using existing materials, or directly improve on the current concept.

Roadblock ¯\_(ツ)_/¯ / Key Decision 🗝

We had to pivot during one of the later weeks of our project because we came to the conclusion, with the help of our last group of designers, that our initial proposed solution wasn't feasible.

User Flow

A final user flow was proposed based on the previous rounds of user testing. It strongly influenced the design submission mockups that were created.

A user flow of the final submission. The user goes through interacting with a screen, using audio to find their way.

Key Decision 🗝

Pivoting this late into a project worries me, but at this point the alternative was a poor solution. I think the decision to pivot at this time was a success due to the capabilities of our latest design team. After understanding our Key Issues, we were able to adapt accordingly.

A Breakthrough 💥

By the end of this phase, our team had a stronger understanding of our project's goal and vision. Everything was coming together at this point. Our design team pulled through and provided us with great material and insights to work with. Without their dedication, I'm not sure we would have been able to pull this off.

Final Refinement

Solution

The solution is inspired by the question: “How can we improve the experience of wayfinding in parks and trails for visually impaired people so that they feel encouraged to enjoy recreational activities in public spaces?”

The group proposed a solution that will help visually impaired people navigate more effectively in public spaces by improving on existing wayfinding monolithic structures. The solution includes a wearable device for the user and a system of monolithic hotspots in public spaces that help guide wayfinding through sound, haptic feedback, and textured surfaces.

A Breakthrough 💥

With help from our last designer group we were able to successfully pivot from our initial idea and propose a new, more refined idea.

An image of a decision point providing audio cues, tactile pavement, and braille signs to allow for many individuals to find their way.

Decision Points

The proposed project adds touch and sound elements to map directories alongside an app using GPS technology. Adding raised, tactile features will help users understand a map’s contents without relying on visuals.

Direction boards are often placed at crossroads where there are multiple directions to navigate. These boards serve as hotspots that help users know whether they are on the right track to their destination. To make them more accessible for visually impaired people, each direction board has a speaker, the direction signs are supplemented with braille, and textured surfaces on the ground lead the user to the post.


Kiosks

When a user touches an element on the map, capacitive touch sensors trigger a sound clip. The clip describes the element and its location and can connect to a user's app. These features can be inexpensively implemented by incorporating microcontroller units and graphite paint into existing map kiosks. The capacitive touch enhances the accessibility of current directories without relying solely on touchscreen solutions.
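As a rough sketch of how that kiosk behaviour could work, the snippet below polls a set of capacitive pads (one per map element) and plays the matching audio description when a pad is touched. The pad names, audio file paths, and the read_touched_pads / play_clip helpers are hypothetical stand-ins for whatever microcontroller library the final hardware would use; this illustrates the interaction logic only, not the project's actual firmware.

```python
import time

# Hypothetical mapping from capacitive pads (graphite-painted map elements)
# to the audio description that should play when that element is touched.
PAD_TO_CLIP = {
    "trailhead": "audio/trailhead_description.wav",
    "lookout": "audio/lookout_description.wav",
    "washrooms": "audio/washrooms_description.wav",
}

def read_touched_pads():
    """Stand-in for the microcontroller's capacitive-touch read.

    On real hardware this would query the touch controller and return
    the IDs of any pads whose capacitance crosses the touch threshold.
    """
    return []  # no touches in this stub

def play_clip(path):
    """Stand-in for the kiosk's speaker output."""
    print(f"Playing {path}")

def kiosk_loop():
    last_played = None
    while True:
        touched = read_touched_pads()
        if not touched:
            last_played = None  # finger lifted; allow the next touch to replay
        for pad_id in touched:
            clip = PAD_TO_CLIP.get(pad_id)
            # Avoid restarting the same clip while a finger rests on the pad.
            if clip and clip != last_played:
                play_clip(clip)
                last_played = clip
        time.sleep(0.05)  # poll at roughly 20 Hz

if __name__ == "__main__":
    kiosk_loop()
```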

A navigation app on a user's own device improves the user experience of digital map solutions and reduces the cost of implementation. This phase originally made use of an NFC and GPS bracelet rather than the user's own device, but using one's own device is less expensive and easier for users. By using their own device, users also let the app tap into existing accessibility customizations that are already familiar to them.
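To make the checkpoint-to-checkpoint guidance concrete, here is a minimal sketch of how a companion app might decide when to fire a spoken or haptic cue from plain GPS fixes. The checkpoint names, coordinates, and the 10-metre arrival radius are invented for illustration; a real app would pull this data from the kiosks or trail database and hand the resulting message to the phone's built-in text-to-speech and vibration accessibility features.

```python
import math

# Hypothetical checkpoint list for one trail, as (name, latitude, longitude).
TRAIL_CHECKPOINTS = [
    ("trailhead kiosk", 43.6453, -79.3806),
    ("lookout direction board", 43.6471, -79.3791),
    ("washroom junction", 43.6490, -79.3778),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine)."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def feedback_for_fix(lat, lon, next_index, arrival_radius_m=10):
    """Decide what cue to give the user for their current GPS fix.

    Returns (message, arrived); `arrived` tells the caller to advance to the
    next checkpoint. On a phone, `message` would be spoken aloud and paired
    with a vibration pattern from the device's accessibility APIs.
    """
    name, clat, clon = TRAIL_CHECKPOINTS[next_index]
    d = distance_m(lat, lon, clat, clon)
    if d <= arrival_radius_m:
        return f"You have reached the {name}.", True
    return f"{name} is about {round(d)} metres ahead.", False

# Example: a fix a short walk away from the first checkpoint.
print(feedback_for_fix(43.6450, -79.3810, next_index=0))
```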

An image of a kiosk on a traditional map screen. On the left is the legend of the map, featuring braille text and raised symbols for the corresponding map on the right.
A sketch of a bracelet that is wearable by a user finding their way. The bracelet provides haptic feedback using vibrations as well as audio cues with a button to play each sound.

Accessories

A wearable bracelet is specifically designed for navigation and wayfinding using sound and haptics as forms of feedback. The device would have NFC (Near Field Communication) and signal synchronization technology (Bluetooth / Wi-Fi hotspot) for communicating data with the monolithic structures in public spaces.

The proposed bracelet idea was later changed to support individuals' phones and existing wearable devices. This created a more accessible project and a more cost-effective strategy for users.

A Breakthrough 💥

Seeing this all come together was such a weight off of our shoulders. It finally looked like a proper submission!

Design Submission

Image description: On the left, a rendered 3D image of a wayfinding direction sign with braille. To the right, a map kiosk. Tactile pavement, braille, raised images, and speaker systems are featured on both kiosks. A silhouette of a woman standing next to a man in a wheelchair sits between the two kiosks.
Image description: Woman hiking with her child in the forest. She's interacting with the enhanced map kiosk to plan her route and learn about the trail.
Image description: Three hikers on a snowy nature trail approaching a direction board sign enhanced with tactile pavement, sound, and braille.
Image description: A woman walking through the woods, guided through the trails by a smartwatch app and its vibrations.
Image description: Elderly person using the app on their smartphone to navigate a forest trail. Their phone is vibrating and playing sounds to provide navigation feedback, as well as displaying an enlarged map.

Submission Formats

Speech to Text

Files were submitted using speech-to-text software, with a completely spoken-word file included as part of the submission.

Braille Ready Files

A Word document with raw text was submitted, ready for processing in text-to-braille software.

Fonts & Colours

No italics or capital letters were used in the submission. All images were tested in colour blindness software, checking that anyone could see the image.

Accessible PDF

A fully accessible PDF was submitted, with alternate text tags for images and each paragraph tagged for text-to-speech reading order.