Technology empowers all of us, including people with disabilities. According to a recent WHO survey, there are about 285 million visually impaired people worldwide; a majority of them are over the age of 50, reside in less developed countries, and come from low-income settings. There is therefore a need to develop low-cost, user-friendly solutions specifically for the visually impaired (VI) community. Modern developments in navigation, localization, and computer vision technologies provide aids for many of the transit difficulties faced by the VI, who rely heavily on public transit systems. Based on interview feedback and a survey of previous research, we found that navigating to the boarding point on the platform is the most challenging part of their commute. We propose to develop a platform boarding assistance system for the VI that uses intuitive haptic feedback for guidance while requiring minimal infrastructure resources such as cellular and Wi-Fi coverage.
A tremendous amount of effort stems from both the industrial and research communities to support the visually impaired. So far the work can be broken down into two categories: navigation and guidance. Outdoor navigation using GPS has become routine, and indoor navigation, thanks to the ease of instrumenting a confined environment, has achieved enough sophistication for indoor route-finding. At the same time, the behaviors of the VI have been studied in depth, and a multitude of guidance delivery mechanisms have been researched: verbal instructions, delivered through in-ear or open-ear headphones or broadcast speakers; portable Braille devices; modulated cellphone vibrations; or some combination thereof for multi-sensory notifications.
One organization in particular, Wayfindr, has created an open standard for audio-based navigation systems. Its standard covers almost every aspect of wayfinding, with a focus on constructing the most efficient verbal notifications. Emerging indoor navigation technologies such as Bluetooth Low Energy (BLE) beacons hold the key to opening up the world for visually impaired people; however, to achieve the greatest impact globally, there is a pressing need for a consistent standard implemented across wayfinding systems, and that is what the Wayfindr project is most focused on. Its exclusive emphasis on auditory feedback, however, remains the main downside of this solution.
For systems with haptic feedback, smart canes and wearable devices such as smart vests or smart headbands are being developed. Much of the focus has been on obstacle detection and avoidance, making use of ultrasound sensors and haptic engines. The picture on the left shows a smart cane with ultrasound sensors; on the right, a smart vest made from electro-active polymers that maps the locations of obstacles onto vibration regions of the vest.
We are lucky enough to have met one of our fellow Bruins, Brandon Shin, who lost his vision in his childhood. It was only through constant in-depth communication with him that we began to understand what the VI are capable of, how they handle things we sighted people take for granted, and what they really need, which in our opinion goes beyond technological innovations (except, perhaps, for fully recovering their vision). We were in total awe when he performed echolocation to distinguish grassland from clusters of bushes. The VI community deserves our understanding and respect.
With the understanding that not all of the VI population can perform echolocation, that almost all are proficient with canes (one of the few things they always carry with them), and that support for the VI depends heavily on social and financial resources, our system differentiates itself in the following aspects:
Boarding Assistance for the Visually Impaired with Intuitive Haptic Feedback
Coarse localization using the BLE beacons
The lack of cellular and Wi-Fi access and the drive for lower cost call for BLE-beacon-based positioning. Simple proximity-based positioning is always viable, as every BLE beacon broadcasts a unique address that can be associated with a location of interest such as a bus sign. In addition, with some processing of the RSSI data, approximate trilateration can be achieved. We will use a set of BLE beacons to coarsely localize the visually impaired user and guide them toward the boarding entrance of the train.
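As a rough illustration of the proximity step, RSSI can be converted to an approximate distance with a log-distance path-loss model and the nearest known beacon selected. The beacon addresses, location labels, and calibration constants (measured power at 1 m, path-loss exponent) below are illustrative assumptions, not measured values:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.5):
    """Log-distance path-loss model: estimate distance (m) from RSSI.
    tx_power_dbm is the expected RSSI at 1 m (calibrated per beacon);
    n is the path-loss exponent (~2 in free space, 2.5-4 indoors)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

# Hypothetical beacon table: BLE address -> point of interest on the platform.
BEACONS = {
    "AA:BB:CC:00:00:01": "bus sign",
    "AA:BB:CC:00:00:02": "boarding entrance",
}

def nearest_beacon(scans):
    """scans: list of (address, rssi_dbm) pairs from one BLE scan window.
    Returns (label, estimated_distance_m) of the closest known beacon,
    or None if no known beacon was heard."""
    best = None
    for addr, rssi in scans:
        if addr not in BEACONS:
            continue
        d = rssi_to_distance(rssi)
        if best is None or d < best[1]:
            best = (BEACONS[addr], d)
    return best
```

In practice the RSSI would be smoothed (e.g., a moving average over several advertisements) before conversion, since indoor multipath makes single readings noisy.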
Accurate alignment using the UWB radios
The entrance of the transport vehicle will be equipped with two UWB radio modules (anchors), one on each side, and one additional UWB radio (the tag) is embedded in the cane. Two-way ranging is used to measure the distance from the tag to each anchor, and a precision of 10 cm can be achieved based on data from Decawave, the vendor of the radio module. With one more anchor, accurate trilateration can be carried out to locate the user in more complex scenarios.
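Under the assumed geometry (the two anchors flanking the entrance along a known baseline), the two ranging distances determine the cane's position by elementary two-circle intersection. The sketch below works from the measured distances only and is not Decawave's API:

```python
import math

def align_to_entrance(d1, d2, baseline):
    """Two-anchor alignment sketch (assumed geometry, illustrative only).
    Anchors sit at (0, 0) and (baseline, 0) on either side of the entrance;
    d1, d2 are two-way-ranging distances (m) from the cane tag to each anchor.
    Returns (offset, depth): offset is the lateral error from the entrance
    centerline (negative = left of center), depth the distance to the door line."""
    x = (d1**2 - d2**2 + baseline**2) / (2.0 * baseline)
    depth_sq = d1**2 - x**2
    depth = math.sqrt(max(depth_sq, 0.0))  # clamp small negatives from ranging noise
    offset = x - baseline / 2.0
    return offset, depth
```

The sign of `offset` is what the haptic dial would translate into a veer-left or veer-right instruction; with a 10 cm ranging precision, alignment well within the width of a train door is plausible.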
Haptic actuation and feedback
Directional instructions are issued by a servo motor driving a rotary dial, paired with a haptic driver. The user will have the option to receive instructions adaptively or only upon request; the request, a specific motion of the white cane, is detected by the IMU mounted on the cane. The rotation of the motor is coupled with specific haptic signatures for different commands and different stages of navigation. In addition, the timing synchronization between beacon detection and notification delivery, though at the human scale, is still critical so as not to create any confusion for the subjects.
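The coupling of dial rotation with haptic signatures might look like the following sketch. All angles, pulse patterns, and the servo/haptic interfaces are hypothetical placeholders, not the actual driver API:

```python
# Map each navigation command to a dial angle plus a haptic signature
# (illustrative values; a real design would be tuned with VI users).
COMMANDS = {
    # command: (servo angle in degrees, haptic pattern as (on_ms, off_ms) pairs)
    "veer_left":  (-45, [(100, 50), (100, 50)]),  # two short pulses
    "straight":   (0,   [(300, 0)]),              # one long pulse
    "veer_right": (45,  [(100, 50), (100, 50)]),
    "arrived":    (0,   [(80, 40)] * 4),          # rapid burst
}

class ServoStub:
    """Stand-in for the real servo driver."""
    def __init__(self):
        self.angle = None
    def rotate_to(self, angle):
        self.angle = angle

class HapticStub:
    """Stand-in for the real haptic driver; records pulses instead of vibrating."""
    def __init__(self):
        self.pulses = []
    def pulse(self, on_ms, off_ms):
        self.pulses.append((on_ms, off_ms))

def issue_command(name, servo, haptic):
    """Rotate the dial toward the commanded direction, then play the
    haptic signature so the stage of navigation is unambiguous."""
    angle, pattern = COMMANDS[name]
    servo.rotate_to(angle)
    for on_ms, off_ms in pattern:
        haptic.pulse(on_ms, off_ms)
```

Keeping the angle and the vibration pattern redundant (the dial points right and buzzes twice) is the multi-sensory redundancy argued for above, so the user can confirm a command through either channel.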
Assumption 1: Our proposal assumes that the vehicle will stop within a predetermined area, even though the exact stopping location may vary. Such constrained uncertainty applies to the clear majority of public transit systems. What we do not consider, however, is the case where the predetermined area extends beyond the reach of the beacon broadcast; beacons do cover some extended distance, and a few more beacons do not cost much. When the vehicle parks arbitrarily, the location of the subject and the location of the vehicle cannot be calculated directly by one central processing unit. In this scenario, the location of the user can be associated with nearby beacons, while the location of the vehicle is known accurately by its on-board instruments with respect to a possibly different set of beacons. Communication between the two devices must then be carried out, which necessitates device-to-device protocols such as LTE Direct or networked solutions involving cloud processing.
Assumption 2: Another scenario excluded to some degree from this project is crowd monitoring and on-platform route-finding. The justifications are: 1) the white cane already provides general help for the VI in keeping their distance from other people; 2) benign behavior is assumed of the sighted population, that is, when sighted people see a VI person coming, they will react and yield the right of way; 3) based on our research, crowd monitoring and on-platform route-finding can be implemented with more infrastructure support such as cameras and Wi-Fi. The processing of the collected data should ideally be real-time, but the amount of processing most likely requires centralized computing. We are again faced with the need for device-to-device communication, but in this scenario we must consider the latency of cloud connectivity as well.