The SensorTag arrives pre-installed with firmware intended to interact with smartphone apps. In this connectable mode, the SensorTag advertises for only two minutes if no smartphone attempts a connection and activates the UI. When we first read about the over-the-air download (OAD) programming capability of the SensorTag, we thought we could simply reconfigure the tag by changing the discoverable mode, since all we needed were device addresses and their RSSI.
#define DEFAULT_DISCOVERABLE_MODE GAP_ADTYPE_FLAGS_LIMITED // change to GAP_ADTYPE_FLAGS_GENERAL
However, this approach did not work. One possible reason is that we did not have the exact source code the SensorTags were running. The application software must match the BLE stack, and the stack version in the pre-installed firmware can be quite old. To reprogram both the application and the stack, we had to use the DevPack and program the SensorTag through the JTAG interface. When the firmware image is sent to the SensorTag, it is first stored in external SPI flash memory. Once the entire image has been received, the SensorTag reboots and installs the new image from the external flash.
The final application and stack source code are a nearly intact port of the simpleBLEBroadcaster example from the simplelink ble_c26xx_2_01_01_44627 stack installation. Among all the stack releases, only this version seems to support the broadcaster role on the SensorTag. Texas Instruments markets several development platforms for the CC2650 wireless MCU, and with newer stack releases the examples target the premium kits, such as the LaunchPad and the evaluation board.
One additional challenge particular to the SensorTag is that programming is not very robust. Sometimes a tag needed repeated JTAG programming of the same image before it was actually updated. A few SensorTags appeared bricked right after programming and had to be "resurrected" with a "jump" connection from the DevPack. We did not discover these details at the beginning and took a long detour, delving into every library and the software protocol for an explanation.
On the PU side, BLE detection and message parsing are straightforward with BlueZ on Linux. With the newest BlueZ and all its dependencies installed, BLE detection is just a command away:
$ sudo hcitool lescan --duplicates
The '--duplicates' option enables continuous scanning for BLE devices, and with the hcidump utility every bit of information, such as the bdaddr and the RSSI, is streamed in:
$ sudo hcidump -a
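As a rough illustration, both tools can be driven from Python and the hcidump output scraped for addresses and signal strengths. The sketch below assumes hcidump prints "bdaddr" and "RSSI:" lines for each LE advertising report; the exact text varies across BlueZ versions, and both commands need root.

import re
import subprocess

# Keep the scanner running in the background; we only read hcidump's text output.
scan = subprocess.Popen(['hcitool', 'lescan', '--duplicates'],
                        stdout=subprocess.DEVNULL)
dump = subprocess.Popen(['hcidump', '-a'], stdout=subprocess.PIPE,
                        universal_newlines=True)
addr = None
for line in dump.stdout:
    m = re.search(r'bdaddr\s+([0-9A-F:]{17})', line)
    if m:
        addr = m.group(1)                      # remember the reporting device
    m = re.search(r'RSSI:\s*(-?\d+)', line)
    if m and addr:
        print(addr, int(m.group(1)))           # one (bdaddr, RSSI) sample per report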
The RSSI is only a relative measure, and it can vary depending on the BLE receiver hardware. We discovered that the Bluetooth receiver on the Raspberry Pi produces more sensible results than a StarTech BLE dongle on the BBB. The following two plots show the RSSI time series taken on both platforms as the tag is moved away and then back (for the Pi the same movement is repeated, hence the two "valleys"). The Pi has a much better dynamic range than the BBB. The notch towards the end of the Pi trace is intentional: the tag stayed at the same location while the user holding it did a 180-degree turn, to study the attenuation caused by the human body.
It is also apparent from the time series that the RSSI readings are very noisy, often oscillating between two values even though the tags are programmed with a nominal TX power of 0 dBm. Filtering must be applied before using the data. Fortunately, simple averaging yields a reasonable outcome.
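A minimal sketch of this kind of averaging filter is shown below; the window length of 8 is illustrative, not the value used in the project.

import collections

class RssiFilter:
    """Sliding-window average of the last few RSSI samples, per tag."""
    def __init__(self, window=8):                  # window size is an assumption
        self.samples = collections.defaultdict(
            lambda: collections.deque(maxlen=window))

    def update(self, bdaddr, rssi):
        buf = self.samples[bdaddr]
        buf.append(rssi)
        return sum(buf) / len(buf)                 # smoothed RSSI for this tag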
The DWM1000 module is just the radio part; it needs a microcontroller to build a system around it. For the hardware we adopted the design by Wayne Holder, and for the software we used a GitHub library for Arduino.
Ideally the system should just work, and it does, but with poor consistency. A unit (DWM1000 + Arduino Pro Mini) would work for 10 minutes and then suddenly drop out. A unit would work for a whole week, and the next day it would refuse to respond. We did have a few cold solder joints, but after identifying and fixing all the assembly problems the inconsistency persisted. Naturally, we first dug into the software for possible collision and packet corruption mechanisms, but to no avail.
Then we purchased a few more DWM1000 modules... the situation did not improve.
Then we replaced all the Pro Mini boards with Pro Micro boards... if anything, it got worse.
Then, out of frustration, we started to monitor the signals trace by trace. We found that the non-responsive modules never came out of reset and that the suddenly-dead ones were being reset randomly. We had read about issues with the RSTn connection on GitHub, and we were using the supposed workaround. The problem is that the DW1000 user manual specifies that the RSTn pin must not be pulled high, and the library therefore sets the following after driving the RSTn pin low:
pinMode(_rst, INPUT); // some suggested pinMode(_rst, INPUT_PULLUP) as a workaround
This is probably the right thing to do if the microcontroller is directly connected to the module. However, in most of the cases we saw in the GitHub discussion, a 5 V to 3.3 V level converter sits between the microcontroller and the module. Since the RSTn pin is an open-drain design, any level converter with a high input should be able to pull the RSTn pin up and bring the module out of reset. The statement in the user manual only means that the RSTn pin must not be hardwired to a pull-up source.
After modifying the module reset code to
pinMode(_rst, OUTPUT);    // actively drive RSTn instead of leaving it floating
digitalWrite(_rst, HIGH); // release reset through the level converter
the consistency improved drastically. Sometimes a power cycle and a microcontroller reset are still needed to bring up the Decawave, but we were happy that the modules were up and running. Efforts are still being made to see whether changing the setup completely (a different level converter or none at all, or a more powerful microcontroller) will improve robustness further.
The examples DW1000RangingTAG and DW1000RangingANCHOR from GitHub already provide enough functionality for our ranging application. The tag is connected to the Raspberry Pi directly, and the anchor addresses and distances are sent to the Pi over the USB UART.
The distance readings are fairly precise when the distance is greater than 0.6 m, which is sufficient for our application. Below 0.5 m, the resolution limit and the software delay start to manifest themselves and the readings become inaccurate. A three-sample average with exception handling ('not found' due to packet drop) is in place to ensure smooth and timely distance updates.
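A minimal sketch of this reader follows, using pyserial on the Pi. The port name, baud rate, and the line format "from: <anchor> range: <meters>" are assumptions; the exact output of the GitHub examples may differ.

import collections
import serial  # pyserial

port = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)    # port/baud are assumptions
window = collections.defaultdict(lambda: collections.deque(maxlen=3))

def read_distance():
    """Return (anchor, averaged distance) or None on a dropped/garbled packet."""
    line = port.readline().decode(errors='ignore')
    try:
        # assumed line format: "from: <anchor> range: <meters>"
        fields = line.split()
        anchor, dist = fields[1], float(fields[3])
    except (IndexError, ValueError):
        return None                      # 'not found' or corrupted line: skip it
    buf = window[anchor]
    buf.append(dist)
    return anchor, sum(buf) / len(buf)   # three-sample running average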
The 9DOF absolute orientation sensor gives smooth and consistent heading readings even when the roll and pitch are fairly large. We also experimented with converting the quaternion readings to a heading value, which is said to give a more accurate absolute heading, and found that the fusion algorithm in the BNO055 already does a good job of heading calculation in 9DOF mode. We decided to use the sensor heading value directly for orientation inference and direction indication.
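For reference, the quaternion-to-heading conversion we experimented with is the standard yaw extraction below; the axis convention (ZYX Euler here) and the usable axis depend on how the sensor is mounted.

import math

def quat_to_heading(w, x, y, z):
    """Yaw angle in degrees [0, 360) from a unit quaternion (ZYX convention)."""
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return math.degrees(yaw) % 360.0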
On the other hand, feature extraction for our designated cane gesture required more study. The most common motion in white cane usage is the sweeping action for detecting obstacles and stairs. We tried to draw as clear a boundary as possible against the sweeping motion by assigning an accelerated forward lifting motion for recognition, yet we did NOT want it to be confused with another common scenario: a VI person moving from a seated to a standing position can also impart a casual forward motion to the cane. The threshold had to be set carefully to distinguish intentional from unintentional motions.
We first concluded that combined constraints on the linear acceleration in all three dimensions provide the best detection experience. Since the acceleration changes instantaneously, motion recognition alone sets the rate of the IMU monitoring thread. Later, in the implementation stage, we found that the accelerometer is the most onerous sensor to calibrate. The angular rate from the gyroscope, which captures the same motion but is much easier to calibrate, was used in the final implementation:
x, y, z = bno.read_gyroscope()                    # angular rate about each axis
if abs(y) > 1 and abs(x) < 0.5 and abs(z) < 0.5:  # dominant rotation about y only
    detected = True                               # forward lifting gesture detected
The most important thing when using the servo is to give clear instructions about turning left or right, since we humans have only a rough sense of angular movement. Internally, however, the processing unit must calculate the angles with good precision for continuous guidance. This relaxed requirement on the servo output allowed the servo to be used without feedback and made its limited rotation range of 135 degrees acceptable. The servo center position is designed to always align with the IMU heading, which is done by physically aligning the center axis along the IMU axis and using the center-position PWM to reset the servo. The call for a left or right turn is determined with a simple and effective change-of-base algorithm (due to physical mounting constraints, the 0-degree heading is due West on the prototype, instead of due North).
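A minimal sketch of that change-of-base calculation is given below. The 270-degree offset, its sign, and the function names are illustrative; the actual values depend on how the IMU is mounted.

def to_compass(imu_heading, offset=270.0):
    """Rebase the IMU heading (0 deg = due West on the prototype) to a
    North-referenced compass bearing. Offset value/sign depend on mounting."""
    return (imu_heading + offset) % 360.0

def turn_command(imu_heading, target_bearing):
    """Signed turn in degrees: negative means turn left, positive turn right."""
    delta = (target_bearing - to_compass(imu_heading) + 180.0) % 360.0 - 180.0
    return delta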
The code for the haptic driver is a Python port of the Arduino library for the DRV2605L.
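To give a flavor of what such a port looks like, the sketch below fires a single "click" effect over I2C with smbus. The register addresses follow the DRV2605L datasheet, but the sequence is a simplified illustration, not the project's actual port.

import smbus

DRV2605_ADDR = 0x5A                 # fixed I2C address of the DRV2605L
bus = smbus.SMBus(1)                # I2C bus 1 on the Raspberry Pi

bus.write_byte_data(DRV2605_ADDR, 0x01, 0x00)  # MODE: exit standby, internal trigger
bus.write_byte_data(DRV2605_ADDR, 0x03, 0x01)  # select ROM waveform library 1 (ERM)
bus.write_byte_data(DRV2605_ADDR, 0x04, 0x01)  # sequencer slot 0: effect #1, strong click
bus.write_byte_data(DRV2605_ADDR, 0x05, 0x00)  # slot 1: zero terminates the sequence
bus.write_byte_data(DRV2605_ADDR, 0x0C, 0x01)  # set the GO bit to play the sequence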
Naturally, several threads have to run to make the entire system work as a whole. The main background threads monitor the BLE scan output, the DW1000 ranging messages, and the IMU readings.
Most of the time these threads simply parse strings and dump sensor readings into a customized ring buffer that can be read by the main process. The seek request and the DW-in-range event are both asynchronous in nature and must be monitored carefully and promptly, which sets the main loop to a period of 1 second. Once a seek request occurs, however, the scheduling period can be relatively long, as the user is expected to move at a slow walking speed.
The seek request and the DW-in-range event are the conditions for starting the actual guidance routine. They are the outputs of their respective background monitoring threads, yet they must be accessible in real time by the other threads and the main process. This is again handled by the ring buffer, which is similar to the synchronization-queue concept.
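The sketch below shows one minimal way to implement such a buffer (an assumed implementation, not the project's exact code): background threads append without blocking, and readers always see the latest value.

import threading
from collections import deque

class RingBuffer:
    """Fixed-size buffer shared between background threads and the main loop."""
    def __init__(self, size=16):
        self._buf = deque(maxlen=size)   # oldest entries are dropped automatically
        self._lock = threading.Lock()

    def push(self, item):                # called by a background monitoring thread
        with self._lock:
            self._buf.append(item)

    def latest(self):                    # called by the main process / other threads
        with self._lock:
            return self._buf[-1] if self._buf else None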
The guidance routine in the prototyping stage assumes a simple 2D station map.
Rather than copy-and-pasting the source code, the flow chart for the final integration is shown next. The integration is done in Python.