Finished
Dec 1:
Nov 28:
Nov 26:
Nov 25:
Nov 23:
Testing the classifier in real time with the steering wheel (Nov 22/23)
Two hands on wheel (Case 2)
One finger touch on metal portion of wheel (Case 1)
Dirty waveforms for 10k and 18k resistor configuration
3.9mH, 10kΩ, exposed wire touch interface
envelope detector: 1MΩ and 100pF
sweep from 100kHz to 1MHz over a total of 60ms
The two pictures above depict our initial setup of the analog portion of our system. We have a signal generator sweeping through 1kHz-3.5MHz at 0.58Vpp over a 33ms interval, which is then sent through a non-inverting op-amp (LT1360) to boost the signal to 5.8Vpp. This signal is then applied to the touch interface.
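The op-amp stage takes the signal from 0.58Vpp to 5.8Vpp, i.e. a closed-loop gain of about 10x. As a quick sanity check of that number, here is the ideal non-inverting gain formula; the resistor values below are illustrative, not measured from our board, and are just one pair that produces a 10x gain:

```python
# Ideal non-inverting op-amp gain: G = 1 + Rf/Rg.
def noninverting_gain(rf: float, rg: float) -> float:
    """Closed-loop gain of an ideal non-inverting op-amp stage."""
    return 1.0 + rf / rg

v_in_pp = 0.58
gain = noninverting_gain(rf=9_000.0, rg=1_000.0)  # hypothetical 9k/1k feedback pair
v_out_pp = v_in_pp * gain
print(f"gain = {gain:.1f}x, output = {v_out_pp:.2f} Vpp")  # gain = 10.0x, output = 5.80 Vpp
```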
We identified that the analog section of the circuit is essentially a notch filter, composed of a biasing inductor and a capacitive component introduced by the presence of the user. Having gained this key insight into the design of the circuit, we were able to simulate its behavior with the LTspice design tool. The picture below shows the simplified analog section of the circuit with an assumed capacitance of 100pF, a ballpark estimate of the capacitance of a human body. The input signal was swept from 10Hz to 10MHz.
As expected, the simulation showed a notch at a centre frequency dependent on the values of the inductor and the capacitor. Due to a scarcity of components, the design had to be changed to compensate: in particular, 100uH inductors were not available and had to be replaced by two 3.9mH inductors connected in parallel, for an effective inductance of 1.95mH. The simulation below shows the effect this had on the system.
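Assuming the notch sits at the LC resonance frequency f0 = 1/(2π√(LC)) of the biasing inductor against the assumed 100pF body capacitance (consistent with the model simulated above), the shift caused by the inductor substitution can be estimated directly:

```python
import math

def notch_freq(l_henry: float, c_farad: float) -> float:
    """Resonant frequency of an LC pair: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

C_BODY = 100e-12                           # assumed 100pF body capacitance
f_original = notch_freq(100e-6, C_BODY)    # intended 100uH inductor
f_modified = notch_freq(1.95e-3, C_BODY)   # two 3.9mH in parallel = 1.95mH
print(f"100uH design:  {f_original / 1e3:.0f} kHz")   # ~1592 kHz
print(f"1.95mH design: {f_modified / 1e3:.0f} kHz")   # ~360 kHz
```

The larger effective inductance pulls the notch down by a factor of √(1.95mH/100uH) ≈ 4.4, which is consistent with the centre-frequency shift seen in the simulation.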
The modified circuit shifted the centre frequency of the notch as predicted. Despite this modification, the system still functioned as intended.
We set up the underlying architecture of our two microcontrollers. The mbed Nucleo F746ZG is able to transmit buffers to the Beaglebone through UART. These buffers contain 200 sampled points from the envelope detector. Because we have not yet set up the rest of the circuit to collect data from the touch interface, we have just been testing with an arbitrary set of data points.
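The on-wire format for those buffers is not final; as a sketch, assuming the mbed sends each 200-point profile as 200 little-endian unsigned 16-bit ADC counts, the Beaglebone side could unpack a frame like this (the frame layout and names here are our assumptions, not a settled protocol):

```python
import struct

SAMPLES_PER_PROFILE = 200  # sampled points per envelope-detector sweep

def unpack_profile(frame: bytes) -> list:
    """Decode one capacitive profile from a raw UART frame.

    Assumes 200 unsigned 16-bit little-endian ADC samples (400 bytes);
    the real frame layout on our link may differ.
    """
    expected = 2 * SAMPLES_PER_PROFILE
    if len(frame) != expected:
        raise ValueError(f"expected {expected} bytes, got {len(frame)}")
    return list(struct.unpack(f"<{SAMPLES_PER_PROFILE}H", frame))

# Stand-in for a frame read off the UART (e.g. pyserial's ser.read(400)):
fake_frame = struct.pack(f"<{SAMPLES_PER_PROFILE}H", *range(SAMPLES_PER_PROFILE))
profile = unpack_profile(fake_frame)
print(len(profile), profile[:3])  # 200 [0, 1, 2]
```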
Challenges
We were having trouble enabling two-way communication over the UART interface. The Beaglebone can successfully read data from the mbed, but not vice versa. However, in the interest of time, we deemed this to be sufficient for our needs and did not spend time resolving the issue. After all, the Beaglebone just has to receive a buffer of data and process the results. The final result can be sent back to the mbed via I2C or SPI.
Design decisions
On the Beaglebone, we have two threads, one of which collects data from the mbed while the other runs the machine learning algorithm from the Scikit Learn Python library. In order to ensure that the data to be processed is ready, we established a signaling mechanism between the two threads. The machine learning thread will wait until the data collection thread is finished before obtaining the latest capacitive profile.
CapTouch has two processors. One is an mbed Nucleo F746ZG board with an Arm Cortex-M7, which handles SPI frequency selection for wave generation and ADC sampling of the incoming data points from the envelope detector. The other is a Beaglebone Black, which runs feature extraction and machine learning in Python. After classification, the result is passed back to the mbed, which makes a decision based on the detected touch input.
We collect a waveform from the envelope detector, which we call a capacitive profile.
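As a sketch of what the feature-extraction step might pull out of a capacitive profile (the specific features here are our assumption, not a settled design), the position and depth of the notch are natural candidates, since both move with the user's capacitance:

```python
def extract_features(profile):
    """Toy feature extraction from one capacitive profile.

    Returns (notch_index, notch_depth, mean_level). The notch index maps
    back to a frequency via the sweep parameters; the depth and mean level
    capture how strongly the user is loading the circuit.
    """
    mean_level = sum(profile) / len(profile)
    notch_index = min(range(len(profile)), key=lambda i: profile[i])
    notch_depth = mean_level - profile[notch_index]
    return notch_index, notch_depth, mean_level

# Synthetic 200-point profile: flat envelope at 3.0V with a dip around sample 120.
profile = [3.0] * 200
for i in range(110, 131):
    profile[i] = 3.0 - 1.5 * (1 - abs(i - 120) / 10)

idx, depth, mean = extract_features(profile)
print(idx, round(depth, 3))  # 120 1.425
```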