In this project, we integrate AI into a wearable device to make it smarter. Specifically, we work on gloves fitted with various sensors such as flex sensors and accelerometers. These sensors capture information about the hand, such as finger positions and wrist rotation. Using the sensor data, we train an AI model and deploy it to recognize various gestures. Each gesture is mapped to an action, and that action is triggered when the gesture is recognized.
Some of the applications include:
• PC and gaming: The gloves can serve as a wireless mouse for a PC and as a controller in virtual-reality gaming
• Controlling drones: A drone’s altitude and speed can be controlled through gestures
• Controlling a robotic hand: Gestures can drive a robotic hand remotely, which is useful in environments humans cannot reach
• American Sign Language (ASL): The gloves can also recognize the gestures of people with hearing or speech impairments who use ASL to communicate
• Home automation: Simple tasks, such as switching lights and other home devices on and off, can also be performed
Since all the hardware must fit into the gloves, significant constraints arise in both space and computational power. Every available resource is limited: the processor’s clock speed, flash memory, and SRAM. Running a deep learning model in this environment is therefore highly challenging; every resource must be used effectively, and the code must be carefully optimized. Because the gloves are standalone, they run on battery power alone, which further limits what they can do.
We limit our runtime variables to 8-bit signed integers, which reduces computation and in turn saves time and power. With this optimization, a deep learning model can run effectively even on such a constrained system. The gloves must recognize gestures and trigger the mapped actions quickly, and the Bluetooth link must be reliable so that data reaches other devices promptly.
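The 8-bit representation mentioned above is typically achieved with affine quantization, mapping floating-point values into the int8 range. A minimal sketch follows; the scale and zero point are illustrative values, not the parameters of the actual deployed model.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Affine int8 quantization: real_value = scale * (q - zero_point).
// Illustrative parameters only; a real model would derive them
// from the observed range of the sensor data or layer activations.
struct QParams {
    float scale;
    int32_t zero_point;
};

// Map a float to int8, saturating at the type's limits.
int8_t quantize(float x, const QParams& p) {
    int32_t q = static_cast<int32_t>(std::lround(x / p.scale)) + p.zero_point;
    q = std::max<int32_t>(-128, std::min<int32_t>(127, q));  // clamp to int8
    return static_cast<int8_t>(q);
}

// Recover the approximate float value from an int8 code.
float dequantize(int8_t q, const QParams& p) {
    return p.scale * (static_cast<int32_t>(q) - p.zero_point);
}
```

Values outside the representable range saturate rather than wrap, which keeps out-of-range sensor spikes from corrupting the model input.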
1) Processor board: Arduino Nano 33 BLE Sense
Specs of the board:
Processor: Nordic nRF52840
CPU Flash Memory: 1 MB
SRAM: 256 KB
Clock Speed: 64 MHz
I/O Pins: 14
Inbuilt Sensors: IMU, microphone, barometric pressure, temperature, humidity, gesture, light, and proximity sensors.
2) Flex Sensors
3) 10 KΩ Resistors
4) Battery Pack (500 mAh)
The Arduino Nano 33 BLE Sense board has 8 analog input ports (A0–A7).
Five flex sensors, one per finger, are connected to five of the analog pins through 10 kΩ resistors, forming voltage dividers.
Two indicator LEDs, one green and one red, are connected to digital I/O pins as shown in the image below.
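Because each flex sensor sits in a voltage divider with its 10 kΩ resistor, the ADC reading can be converted back into the sensor’s resistance, which rises as the finger bends. A minimal sketch follows; the wiring orientation, 3.3 V supply, and 10-bit ADC resolution are assumptions, and on the glove itself the reading would come from `analogRead()`.

```cpp
// Assumed wiring: flex sensor from VCC to the analog pin, with the
// 10 kΩ resistor from the pin to ground (voltage divider).
const float VCC = 3.3f;          // Nano 33 BLE Sense logic level
const float R_DIVIDER = 10000.0f; // 10 kΩ series resistor
const int ADC_MAX = 1023;         // 10-bit ADC full-scale reading

// Convert a raw ADC reading into the flex sensor's resistance (ohms).
// Vout = VCC * R_DIVIDER / (R_flex + R_DIVIDER), solved for R_flex.
float flexResistance(int adcReading) {
    float vout = VCC * adcReading / ADC_MAX;
    return R_DIVIDER * (VCC - vout) / vout;
}
```

A reading of 512 (about half scale) corresponds to a sensor resistance near the 10 kΩ divider value, which is a convenient sanity check when calibrating the straight-finger baseline.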
When all fingers are held closed for 3 seconds, the glove enters gesture-recognition mode. In that mode, flex-sensor data is collected for 3 seconds and fed to the model for recognition.
During the initial 3-second hold, the red LED is switched on; during recognition mode, the green LED blinks for 3 seconds to indicate that input is being recorded.
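The mode logic above can be sketched as a small state machine: idle until all fingers close, a 3-second arming window (red LED), then a 3-second recording window (green LED blinking). The class name, tick-based timing, and state names below are illustrative; the real firmware would also drive the LEDs and stream samples to the model.

```cpp
#include <cstdint>

enum class GloveState { IDLE, ARMING, RECORDING };

// Advanced once per loop iteration with the elapsed time in ms.
// The 3000 ms thresholds mirror the hold and recording windows
// described above.
class GestureMode {
public:
    // fingersClosed: true when all five flex sensors read "closed".
    GloveState update(bool fingersClosed, uint32_t dtMs) {
        switch (state_) {
        case GloveState::IDLE:
            if (fingersClosed) { state_ = GloveState::ARMING; timerMs_ = 0; }
            break;
        case GloveState::ARMING:                // red LED on in this state
            if (!fingersClosed) { state_ = GloveState::IDLE; break; }
            timerMs_ += dtMs;
            if (timerMs_ >= 3000) { state_ = GloveState::RECORDING; timerMs_ = 0; }
            break;
        case GloveState::RECORDING:             // green LED blinks in this state
            timerMs_ += dtMs;
            if (timerMs_ >= 3000) state_ = GloveState::IDLE;  // window done
            break;
        }
        return state_;
    }

private:
    GloveState state_ = GloveState::IDLE;
    uint32_t timerMs_ = 0;
};
```

Opening the fingers during the arming window cancels the gesture, so accidental fist closures shorter than 3 seconds never trigger recording.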