This is a log of Connor Forney's weekly status updates for Team B2: 'Sole Mate'; Spring 18-500 Capstone.
May 3, 2018
This week I implemented an algorithm for detecting the pronation/supination of each step in real time and added that algorithm to the app. The app now tells the user both how their last step was classified (pronation/middle/supination) and what their last 50 steps average out to across the same classifications. Unfortunately, the algorithm doesn't have quite the accuracy I would've liked, especially when trying to distinguish pronation from middle strikes, but I hope to improve the accuracy before the final demo.
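As a sketch of the rolling 50-step summary (the real feature lives in the app's Processing/Java code; the per-step "score" and its thresholds here are hypothetical stand-ins for the actual classifier output):

```python
from collections import deque, Counter

WINDOW = 50  # number of recent steps in the rolling summary

def classify_step(score, low=-0.2, high=0.2):
    # Hypothetical thresholds on a per-step pronation score:
    # negative = rolling inward, positive = rolling outward.
    if score < low:
        return "pronation"
    if score > high:
        return "supination"
    return "middle"

recent = deque(maxlen=WINDOW)  # oldest labels fall off automatically

def add_step(score):
    """Record one step; return (this step's label, the most common
    label over the last WINDOW steps)."""
    label = classify_step(score)
    recent.append(label)
    majority = Counter(recent).most_common(1)[0][0]
    return label, majority
```

The `deque(maxlen=...)` keeps the window bookkeeping trivial: appending the 51st label silently evicts the 1st.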
The app was also updated to increase throughput of Bluetooth messages it can handle at once, at the cost of potentially missing an occasional line. Since we have pretty high sample rates already, missing a line occasionally shouldn't cause any notable skew to the data, and the additional throughput is needed to handle both IR and IMU data being sent simultaneously (an IR data line is prefixed with an "R" and an IMU data line is prefixed with an "M" to break the data into two different tables once transmitted to the app).
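A minimal sketch of the prefix routing on the app side (Python for illustration only; the app itself is Processing/Java, and the comma-separated field layout is an assumption):

```python
ir_rows = []   # table of IR sensor readings
imu_rows = []  # table of IMU readings

def route_line(line):
    """File one received Bluetooth line into the IR or IMU table
    according to its one-character prefix."""
    line = line.strip()
    if line.startswith("R"):
        ir_rows.append(line[1:].split(","))
    elif line.startswith("M"):
        imu_rows.append(line[1:].split(","))
    # Anything else is dropped -- consistent with tolerating an
    # occasional missed line in exchange for higher throughput.
```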
We also had our Final Presentation on Wednesday, which I personally think went pretty well overall.
Next/Final Steps:
April 26, 2018
I ported all the MATLAB algorithms for detecting steps to the app, so the app can now count the number of steps taken and store the beginning and end of each step, letting the data in between be easily analyzed by further algorithms to extract the necessary features. The algorithms were also tweaked during this transfer to account for the change in data output to millimeters.
I also made changes to the data sending python script running on the Raspberry Pi Zero W so that it now loops endlessly and can handle the phone disconnecting and reconnecting. The script was also changed so that more data can be cumulatively stored up as the script is waiting for the phone to be ready to receive more data, then all the accumulated data can be sent at once to the phone for feature detection there.
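The accumulate-then-send behavior can be sketched as follows (a simplification of the actual script; `send` stands in for whatever writes one line to the Bluetooth socket):

```python
pending = []  # lines accumulated while the phone isn't ready

def queue_line(line):
    """Called as each new sensor line is produced on the Pi."""
    pending.append(line)

def flush(send):
    """Once the phone signals it is ready, send everything accumulated
    so far in one burst, then clear the backlog."""
    for line in pending:
        send(line)
    sent = len(pending)
    pending.clear()
    return sent
```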
I have also made good progress toward an algorithm that analyzes each step for whether that step indicates the runner is pronating, supinating, or landing correctly.
Next Steps:
April 18, 2018
I created a test app to figure out exactly how Android handles file storage, and found that any app can save and load data to private internal storage; these files just aren't visible to the user outside of the app. So, I built a small app that allows the user to generate new data sets, load previous data sets, and delete data sets, all in-app. This proves the ability to save the sensor data for the actual Sole Mate App, and I plan to port the code from the test app to the real app soon.
With the data Nihar collected on Sunday, Reed and I were able to work together to develop some algorithms in MATLAB for detecting what data constitutes a step. First we average the IR sensor data with a window of 25 samples to eliminate the high frequency noise. We chose 25 samples because it allows the actual steps to resemble a sinusoid after being averaged. Then, we compute the "derivative" of the averaged data by looking at the difference between the current data point and the previous 5 data points to see if the current data point increased from the previous few, decreased, or stayed about the same. This is a "derivative" in the sense that we don't actually need the computed value of the derivative, just the sign. After computing this "derivative", the final step is to scan it for where it first goes negative after a period of zero change and/or increase, as this represents the data hitting a maximum, which it does as the foot makes contact with the ground. This process is applied to all 4 IR sensors (Heel, Left, Right, and Front) in tandem, and any region where a majority (3 of 4 or all 4) detected a step is denoted as a step overall. Importantly, all the algorithms are designed to work on current or previous data only, so that they will work in real time as the data is being read.
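The pipeline above can be sketched in Python (the MATLAB originals aren't reproduced here, and the flat-change threshold `eps` is a placeholder): smooth with a trailing window, take the sign of the change against samples a few back, mark where the sign first turns negative, then majority-vote across sensors.

```python
def moving_average(data, window=25):
    """Trailing average over `window` samples; causal, so it only
    uses current and previous data and can run in real time."""
    out = []
    for i in range(len(data)):
        lo = max(0, i - window + 1)
        out.append(sum(data[lo:i + 1]) / (i + 1 - lo))
    return out

def sign_derivative(avg, lookback=5, eps=1e-6):
    """Sign of the change versus a few samples back: +1 rising,
    -1 falling, 0 roughly flat. Only the sign matters, not the value."""
    signs = [0]
    for i in range(1, len(avg)):
        diff = avg[i] - avg[max(0, i - lookback)]
        signs.append(1 if diff > eps else -1 if diff < -eps else 0)
    return signs

def step_starts(signs):
    """Indices where the sign first goes negative after rising or flat
    samples -- the smoothed data has just passed a maximum, i.e. the
    foot has made contact with the ground."""
    return [i for i in range(1, len(signs))
            if signs[i] < 0 and signs[i - 1] >= 0]

def majority_vote(per_sensor_flags, need=3):
    """Combine per-sample step flags from the 4 IR sensors: a sample
    counts as a step when at least `need` sensors agree."""
    return [sum(flags) >= need for flags in zip(*per_sensor_flags)]
```

On a toy ramp-up/ramp-down signal, `step_starts` fires just past the peak, which is the behavior the averaged IR data is expected to show at ground contact.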
With this algorithm to pull out where actual steps are, we now must look at the IR, IMU, and Load Cell data at the times denoted to be steps, and begin developing further algorithms to analyze the Slope of the surface, the Strike pattern, and Pronation vs. Supination.
Next Steps:
April 12, 2018
On Friday, we met up on Flagstaff Hill to collect our first data sets of actual running, although at the time we only had data coming in from the 4 IR sensors; the load cell and IMU were not integrated yet. We realized after the first trial that one of the IR sensors had become disconnected, but went ahead and got more trials for the other 3 sensors, until after the 5th trial, when we found that a second IR sensor had disconnected and the battery had become disconnected as well. We decided to call it there, and since then Nihar has made significant progress in making the connections sturdier and much less likely to come loose. I plotted each of the useful data sets to visualize what a step looked like in the IR sensor data and to start getting ideas about how to best extract the useful components from each step.
On Monday, Professor Mukherjee and Professor Mai expressed concerns over the lack of IMU integration at that point, so I met with Reed on Monday and Tuesday to dig through the datasheets and data collection code associated with the IMU to determine exactly what data it was outputting, how fast it was outputting said data, and importantly how we could better control the IMU. The IMU has since been integrated into the shoe, as has the Load Cell, and we plan to gather data with all the components in the coming days.
I also met briefly with Ben to discuss what was next for the app now that consistent Bluetooth communication had been achieved. Currently, the app has no way of recalling previous data sets, and can only compute on the current data it is receiving, so I am working on figuring out how to save data across runs of the app. It appears that the app can save data privately, but this data is inaccessible to the user outside of the app unless they have root access. This makes it hard to actually collect the data for analysis in developing the algorithms for detecting certain features of the step, but the final app shouldn't need to output its data publicly like this, so it is probably not worth the significant investment it would take to jump through all of Android's hoops to make the data public. Thus, I plan to develop the app such that it can load in previous data sets and output the data over the console, which is rather hacky, but would suit our needs for developing the algorithms, especially with the small amount of time we have remaining.
Next Steps:
April 5, 2018
This week's progress consisted mostly of prepping for the demo, actually performing the demo, and then making changes to the app/Bluetooth connection code based on suggestions from the demo. I have modified the app to easily connect to a second Pi over Bluetooth so that, when the Left shoe is finished, we can get both Pis on both shoes communicating with the app relatively quickly. I am still working out a good data transfer protocol over Bluetooth, but since last week I have added handshaking to the transfer so that both the Pi and the app know when each is ready to send data. Going forward, I want to finish this transfer protocol so that it is error free, and then begin working on the algorithm to determine pronation/supination. We also plan to meet up on Friday to collect more data and begin merging the code we've been working on separately.
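The handshaking idea can be illustrated with a toy model (the message names are hypothetical, and plain lists stand in for the two directions of the Bluetooth socket):

```python
def app_signal_ready(to_pi):
    """The app tells the Pi it can accept another batch of data."""
    to_pi.append("READY")

def pi_try_send(to_pi, to_app, batch):
    """The Pi only transmits once the app has signaled readiness;
    otherwise it keeps accumulating and reports False."""
    if to_pi and to_pi[-1] == "READY":
        to_pi.pop()       # consume the ready signal
        to_app.append(batch)
        return True
    return False
```

The point of the exchange is that neither side writes into a buffer the other isn't draining, which is what caused the earlier lost-data issues.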
Next Steps:
March 29, 2018
Since last week, I have successfully achieved Bluetooth communication between the Raspberry Pi Zero W and the Android phone app. The app and my data sending program are set up such that any phone running the app should be able to connect to any Pi running the data sending code, not just my own personal devices. The next step was to determine how quickly data could be sent over Bluetooth. Processing's main drawing loop maxes out at 60 FPS, and on each frame the app can read up to 1024 bytes from the Bluetooth input buffer, resulting in 480 kbps, which should be enough for the data we want to pass.
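The throughput estimate works out as follows (using 1 kb = 1024 bits):

```python
fps = 60                # Processing's maximum draw-loop rate
bytes_per_frame = 1024  # max bytes read from the Bluetooth buffer per frame

bits_per_second = fps * bytes_per_frame * 8  # 491,520 bits/s
kbps = bits_per_second / 1024                # 480.0 kbps
```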
We also met up on Thursday to gather some data. We had Reed wear the prototype shoe and get some preliminary data from the IR sensors, and then plotted the data. When plotted in Excel, the data matched our expectations for what a step should look like (a peak as the shoe depresses during a step, then a long hump as the user lifts their foot and then re-places it).
Next Steps:
March 22, 2018
With most of the administrative work behind us, we met up this week to go over exactly who was responsible for what going forward. I have taken on some of the more software-related tasks, such as developing the app the user will use to interact with the shoe attachment, as well as determining how we are going to transfer data from the Raspberry Pi Zero W to the app over Bluetooth.
I have begun work on getting the Pi and my Android phone communicating over Bluetooth. I have gotten the Pi and phone to pair successfully, but I have yet to achieve full file/data transfer; I hope to get some dummy data transferring this weekend. I have a rough framework for the app started, consisting of a title screen and a data gathering screen (for now, this data gathering screen just shows data from the phone's own accelerometer, since I have yet to get Bluetooth data transfer working from the Pi). The app is being developed using Processing, which makes porting the app to a phone exceedingly simple; however, it is Java based rather than Python based, so a workaround may be required to interface directly with the Bluetooth, because we plan to use the PyBluez package to aid data transfer, though this may change. Another option is to use ObexFtp to transfer entire files of data rather than transferring the data directly.
Next Steps:
March 8, 2018
We collaborated on getting this website set up in such a way that allows easy updates (since none of us know anything about web design). We've also been preparing the design document, including changes suggested by the Professors/TAs on how to better implement our design.
Because we only have one USB to TTL Serial cable between the three of us, we have been working on setting up the Raspberry Pis so that we can communicate with the Pis over CMU's Wi-Fi, though progress here has been slow. We've also been looking into obtaining more USB to TTL Serial cables should the communication over CMU's Wi-Fi prove to be too much of a headache to get working properly.
The plan moving forward is to start interfacing with the IMU, Distance Sensors, and the Load Cell to collect data from each sensor in isolation, and to ensure the multiple distance sensors will not interfere with each other in our design.