Engineering
Using a Temporal Convolutional Network to Identify Joint Angles from Low-Profile, Transradial Ultrasound Images
Ephram Cukier
Each year, there are more than one million amputations worldwide. However, there are currently no prosthetics available to patients that are both accurate and compact enough to restore functionality to the amputated regions. Most research on smart prosthetics for the rehabilitation of upper-limb amputations has centered on analyzing data from surface electromyography (sEMG) sensors, which detect the electrical signals that activate muscles. Although in recent years there has been research on using ultrasound (US) sensors to identify hand gestures, prototypes have been either bulky or inaccurate. Standard US probes contain 256 scan lines, making the device large and cumbersome for the wearer. Using a modified probe with only 4 scan lines allows the device to remain significantly more compact, at the cost of information. A Temporal Convolutional Network (TCN) functions similarly to a Convolutional Neural Network (CNN) in that it can process and extract information from a 2D image. The difference is that a TCN adds the dimension of time, processing a buffer of consecutive images, though it loses some ability to draw connections from spatial data. This allows it to learn information about motion in the images, which is valuable when attempting to map the motion of muscles. In this work, a TCN-based system was created to use temporal information to compensate for the information lost with the missing scan lines. This approach achieved higher accuracies than the CNN algorithms currently accepted in the field.
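The core operation behind a TCN described above is a causal convolution along the time axis: each output depends only on the current and past frames in the image buffer, never on future ones. The following is a minimal sketch of that operation in plain Python; the kernel values, buffer length, and the idea of applying it to a per-pixel intensity trace from a 4-scan-line image stream are illustrative assumptions, not the authors' actual network.

```python
def causal_conv1d(sequence, kernel, dilation=1):
    """Convolve a 1-D sequence with a kernel so that each output sample
    depends only on current and past inputs (left zero-padding).
    `dilation` spaces out the kernel taps, widening the temporal
    receptive field without adding parameters."""
    k = len(kernel)
    pad = (k - 1) * dilation  # left padding keeps the convolution causal
    padded = [0.0] * pad + list(sequence)
    out = []
    for t in range(len(sequence)):
        acc = 0.0
        for i in range(k):
            # kernel[0] weights the newest sample; larger i reaches further back
            acc += kernel[i] * padded[t + pad - i * dilation]
        out.append(acc)
    return out

# Hypothetical example: intensity of one pixel across a buffer of
# consecutive low-profile ultrasound frames.
trace = [0.0, 0.2, 0.9, 1.0, 0.4, 0.1]
smoothed = causal_conv1d(trace, kernel=[0.5, 0.3, 0.2])
```

A full TCN stacks many such convolutions with increasing dilation, so deeper layers see progressively longer stretches of the image buffer; this temporal context is what substitutes for the spatial detail lost with the missing scan lines.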