Common input modalities like the mouse and keyboard are mostly operated with the hands, which makes them unusable for users who are motor-impaired or have constrained body movements. This project is a virtual keyboard operated through eye-gaze rather than the hands. The keyboard has a QWERTY layout and is integrated with a customized machine learning algorithm that dynamically adapts the dwell time of the keys. 'Dwell time' is the threshold duration for which the user has to fixate their gaze on a key/button on the screen to trigger a mouse click through gaze.
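To give a feel for how dwell-based clicking works, here is a minimal Python sketch. It assumes a stream of (x, y) gaze samples from an eye tracker and a lookup from screen coordinates to the key under the gaze; the names and the one-second threshold are illustrative, not the project's actual code.

```python
import time

DWELL_TIME = 1.0  # seconds of fixation required to "click" a key (assumed value)


def dwell_select(gaze_samples, key_at):
    """Yield a key each time the gaze stays on it for DWELL_TIME."""
    current_key, fixation_start = None, None
    for x, y in gaze_samples:
        key = key_at(x, y)          # which key (if any) is under the gaze
        if key != current_key:      # gaze moved to a different key: reset the timer
            current_key, fixation_start = key, time.monotonic()
        elif key is not None and time.monotonic() - fixation_start >= DWELL_TIME:
            yield key               # fixation exceeded the dwell threshold: "click"
            fixation_start = time.monotonic()  # re-arm for repeated presses
```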
This is one of the initial prototypes we built for gaze-typing. Kritika and Vinay (in the image) are using their eye-gaze to type their names. We use a Tobii eye tracker to track the eyes on a 17-inch screen. While the keyboard can be used by able-bodied persons, its prime importance comes into the picture when it is operated by motor-impaired children at the Spastic Society of India, Chennai.
The red dot on the screen depicts the user's eye-gaze, which is tracked using a Tobii eye tracker placed below the monitor. The keyboard also gives visual feedback (a yellow boundary around the button the gaze is fixated on) and auditory feedback (pronouncing the character typed).
The keyboard is in use by children at Vidya Sagar, Chennai (formerly known as the Spastic Society of India). Most of the children at Vidya Sagar are differently abled and motor-impaired.
To make a gaze-controlled interface for people with motor impairment.
To iterate on the design of the interface after multiple trials with the users. This includes the placement of the keys and the spacing between them, and the inclusion and integration of multiple modalities like sound in the product.
To build a customized machine learning algorithm to dynamically update the dwell time of the keys by learning the user’s usage pattern.
A modified two-level gaze-controlled virtual keyboard for people with motor impairment that is currently in use at Vidya Sagar, Chennai.
The keyboard is integrated with a Markov-based approach that dynamically adapts the dwell time of the keys to increase typing speed while maintaining accuracy.
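A hedged Python sketch of the Markov idea follows: keys that are likely to follow the previously typed character get a shorter dwell time, while unlikely keys keep the default. The probabilities, bounds, and interpolation here are my illustrative assumptions, not the deployed algorithm.

```python
from collections import defaultdict

DEFAULT_DWELL, MIN_DWELL = 1.0, 0.4  # seconds (assumed bounds)

# character-bigram counts learned from the user's own typing
bigram_counts = defaultdict(lambda: defaultdict(int))


def record_keystroke(prev_char, char):
    """Learn the user's usage pattern as character-bigram counts."""
    bigram_counts[prev_char][char] += 1


def dwell_for(prev_char, char):
    """Interpolate dwell time by the transition probability P(char | prev_char)."""
    total = sum(bigram_counts[prev_char].values())
    if total == 0:
        return DEFAULT_DWELL
    p = bigram_counts[prev_char][char] / total
    # more probable next keys get a dwell time closer to MIN_DWELL,
    # so frequent transitions type faster without hurting rare ones
    return DEFAULT_DWELL - p * (DEFAULT_DWELL - MIN_DWELL)
```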
This was a small attempt to understand how wearable gloves work and the various mediums through which haptics can be integrated into such a device. The particular glove I used is the 'Capto Glove', purchased from the company of the same name. The glove contains an IMU (Inertial Measurement Unit) that provides accelerometer, magnetometer, and gyroscope readings along three axes. It has many other features like bending-force calculation and finger-pressure readings, but the current version does not support haptic feedback.
However, my main task was to integrate the glove within a VR environment and find alternative ways of providing feedback to the user. The initial integration involved understanding how to calibrate the glove and connect it with Unity. Since this was my first time working with an SDK (Software Development Kit), it was a real pain to match the right firmware with the right version of Unity and get streaming values from the glove into a Unity scene.
Since the glove doesn't provide any sensory feedback, I built a scene in Unity in which a particular motion of the hand generates sensory feedback for the user (see video below). I mapped the rotational motion of the hand to the linear movement of a square box on the screen, and created a socket to send the 3-D coordinates of the box to an Arduino, which triggers a vibration motor and an LED. Overall, the hand rotation that moves the box to the right side of the screen turns on the motor and the green LED, while rotation in the opposite direction switches them off.
This was an experiment in understanding the glove and how it works. Going further, the motor can be placed on the hand or anywhere else on the body to provide sensory feedback to the user. While the motor gives feedback to the user, the LED gives visual feedback to the people around.
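For illustration, here is a hedged, host-side Python sketch of the socket link described above. The actual sender was the Unity scene; the address, port, and comma-separated message format are my assumptions, not the project's real protocol.

```python
import socket

ARDUINO_ADDR = ("192.168.0.42", 5005)  # hypothetical Arduino endpoint
SCREEN_MID_X = 0.0                     # box to the right of this -> feedback on


def send_box_position(sock, x, y, z):
    # send plain comma-separated coordinates; the Arduino parses them and
    # drives the motor + green LED whenever x > SCREEN_MID_X
    sock.sendall(f"{x:.3f},{y:.3f},{z:.3f}\n".encode())


with socket.create_connection(ARDUINO_ADDR) as sock:
    send_box_position(sock, 1.5, 0.0, 0.0)   # box on the right: motor + LED on
    send_box_position(sock, -1.5, 0.0, 0.0)  # box on the left: both off
```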
keywords : #play #haptics #interactiveapplication
Duration of the project : 2 months (Dec '18 - Jan '19)
My involvement : Complete project
While there are many devices in the market nowadays for children who are visually impaired, and many schools have taken initiatives to teach them, there is still a huge gap between the children and their parents (who have perfect vision). Through my interactions at various schools, I learnt that many visually impaired children are never introduced to school education by their parents, both because of the visual impairment and because of the lack of conversational interfaces available to parents.
Interactive Braille is an application plus a playing device that lets parents help their children with school homework in Braille.
keywords : #braille #processing #interactiveapplication
Duration of the project : (in progress - seeking funding)
My involvement : Entire coding + user study
A demo of the tool. Clicking the top-most circle makes "A" pop out, depicting that the leftmost dot in a Braille cell represents "A". With the right combinations of chosen circles (dots), one can learn alphabets, numbers, and symbols.
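The core of the demo is the mapping from raised dots in the standard 6-dot Braille cell (dots 1-3 down the left column, 4-6 down the right) to characters. A minimal Python sketch of that mapping is below; only the first few letters are shown, while the full table covers letters, numbers, and symbols, and the function name is illustrative.

```python
# subset of the standard Braille alphabet: sets of raised dots -> letters
BRAILLE = {
    frozenset({1}): "A",
    frozenset({1, 2}): "B",
    frozenset({1, 4}): "C",
    frozenset({1, 4, 5}): "D",
    frozenset({1, 5}): "E",
}


def decode(selected_dots):
    """Return the character for the set of circles the user clicked."""
    return BRAILLE.get(frozenset(selected_dots), "?")


print(decode({1}))     # clicking only the top-left circle -> "A"
print(decode({1, 2}))  # top-left and middle-left circles -> "B"
```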
All the long coffee breaks and discussions to know what we can really do.
These are some pictures from my interactions with students at National Association for Blind, Bangalore.
In the past decade, we have seen a great inclination towards the use of solar energy around the globe. While companies like Verengo Solar and SolarCity aim to bring a clean solution to the environment in the coming years, a major part of the world's population still can't relate to how much energy comes from solar devices like a solar panel. And when we think of missions like Mars exploration, where the energy supply becomes heavily reliant on solar energy, planning its future availability becomes a matter of concern due to fluctuations over time in the factors affecting solar radiation.
Firstly, there are not many resources available online for understanding solar consumption. Questions like "Can I use my electric kettle?" or "Can I play Xbox while having my AC on for the next 4 hours?" are common for an end user with solar panels installed. The resources that do exist talk all about numbers like kWh (kilowatt-hours) and the angle of inclination of the sun relative to the solar panel, but none of them connect with end users by answering such questions about their daily usage of household appliances.
My three friends and I built a tool named 'Helios' that helps users understand how much energy is produced by different types of solar panels based on location and time of day. It also gives insight into which appliances can be used together, depending on the energy generated by the panels.
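A back-of-the-envelope Python sketch of the kind of estimate Helios makes is below: panel output from area, efficiency, and local irradiance, then a check of whether a set of appliances fits within that budget. The numbers and the flat irradiance model are illustrative assumptions, not the actual model behind the tool.

```python
PANELS = {"monocrystalline": 0.20, "polycrystalline": 0.15}  # assumed efficiencies

APPLIANCES_W = {"electric kettle": 1500, "AC": 1200, "Xbox": 120}  # typical wattages


def panel_output_w(panel_type, area_m2, irradiance_w_m2):
    """Instantaneous panel output in watts for a given irradiance."""
    return PANELS[panel_type] * area_m2 * irradiance_w_m2


def can_run(appliances, output_w):
    """Can these appliances run together on the current solar output?"""
    return sum(APPLIANCES_W[a] for a in appliances) <= output_w


# e.g. 10 m^2 of monocrystalline panels around midday (~800 W/m^2) -> 1600 W
out = panel_output_w("monocrystalline", 10, 800)
print(can_run(["Xbox", "AC"], out))              # True: 1320 W <= 1600 W
print(can_run(["electric kettle", "AC"], out))   # False: 2700 W > 1600 W
```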
The tool was also our entry to the NASA Space Apps challenge, where it bagged the first runner-up prize. It provides solar energy recommendations to the HI-SEAS crew in Hawaii, for which we were a Global Nominee from the Bangalore chapter in the 2017 challenge.
For a long description of the project : https://2017.spaceappschallenge.org/challenges/earth-and-us/you-are-my-sunshine/teams/future-martians/project
A play-around demo of the tool : https://helios-nasa.000webhostapp.com/
keywords : #datascience #recommendations #solarenergy #NASA
Duration of the project : 2 days
My involvement : Data Science + Design of tool + managing workforce
A video demonstration of the application - Helios
What is an off-road vehicle?
~ Any vehicle that can be driven with ease on uneven terrain full of gravel and rocky surfaces.
This project is under the Society of Automotive Engineers (SAE), which every year holds a competition to build an off-road vehicle from scratch. The project involves three parts:
Virtual round : One has to make a prototype of the vehicle in software like CATIA/AutoCAD and present it to a panel of judges. Along with the design of the vehicle, the judges are also interested in how you optimize the cost and how you differentiate your car from those of the other participants. Around 150 teams across India cleared the virtual round the year we participated, and we were among them. After the virtual round, the task is to build the vehicle from scratch.
Inspection round : This round comes just two months before the final racing round. A senior researcher from SAE inspects the car for safety and checks whether it meets the standards required to race on the final day. We cleared this round and joined the top 120 teams in the final round.
Racing round : This is a three-day event held in Indore, India. It includes various tests followed by the final race: a brake test, steering test, power-train test, and suspension test. After all these tests are passed on the first and second days, the third day is for racing the vehicle and winning the competition.
While the main challenge of the project was to build an end-to-end vehicle, it was essential for us to know what is inside a normal sedan/hatchback car and how a centrally driven off-road vehicle differs from it. As the vice-captain/technical lead, I divided the team into groups: braking system, steering, power train, and chassis. Each group would study online, go to various garages, bend down and look underneath the cars, and then meet in a park to discuss what they had understood and what they hadn't. We had many clashes among ourselves over how to build this big monster. For example, my understanding of the steering assembly might be totally different from that of a member of the power-train group, whose main task is to deliver power to the front wheels, and the material I suggest for the steering rod might break under the stresses from the power assembly.
This entire activity went on from February 2012 to mid-April 2012. Once everybody was on the same page, we decided to get our hands dirty: we picked up some tools and drew up a layout of the car. Each team member was actively involved, we gained in-depth knowledge of the parts and assemblies of various cars, and we always kept an eye on what to install where once we started building. We then presented in the virtual round at Christ University, Bangalore, in August 2012, where we were grilled for 30 minutes on our technical knowledge of car-making, our budget, and our sponsors. We claimed that we would make the entire car in less than 3.5 lakh rupees.
And we did, the reason being that we knew "jugaad" (an Indian way of dealing with things smartly): we installed tires from scrap and used a power train from an Apache of the kind autos used to install. We took a steering assembly from a junkyard, fixed it up, and used central steering in our car rather than left- or right-hand drive. The making of the car is best described by photos rather than words.
keywords : #SAEBAJA #buildingfromscratch #brakes #steering #powertrain #chassis
Duration of the project : 1 year
My involvement : Technical lead + Understanding of the entire vehicle + Brakes and Steering assembly and installation
The week that went into installing the pedals. The issue with the installation was that our vehicle was centrally driven, unlike most right-hand or left-hand drives.
This is the picture we took after the brakes were installed. Can you see the blurred cylinder right below the steering rod? It is the brake-fluid cylinder.
A non-practical roll cage we made out of plastic
A rare photo after the virtual event.
This was a fun project done by me and my two friends. We built a water rocket using one water bottle, one pipe, one air pump, and three brains (just kidding).
The idea was to build the water rocket so that, when launched at 45 degrees, it achieves its maximum range. We didn't know much of the mechanics then that we know now and that could have gone into the making, but our hit-and-trial approach worked pretty well.
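The mechanics we were missing at the time, in a nutshell: for an ideal projectile (no air drag, thrust finished at launch), the range is R = v² sin(2θ) / g, which peaks at θ = 45 degrees. A small Python sketch of this is below; the launch speed is an illustrative number, not a measured value.

```python
import math

g = 9.81  # gravitational acceleration, m/s^2


def ideal_range(v, theta_deg):
    """Range of an ideal projectile launched at speed v (m/s) and angle theta."""
    return v**2 * math.sin(math.radians(2 * theta_deg)) / g


v = 15.0  # assumed launch speed in m/s
for theta in (30, 45, 60):
    print(f"{theta} deg -> {ideal_range(v, theta):.1f} m")
# 45 degrees gives the longest flight: ~22.9 m here vs ~19.9 m at 30 or 60
```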
keywords : #Waterrocket #IITkanpur
Duration of the project : 3 months
My involvement : All three of us were equally involved in making the water rocket.
A glimpse of what a water rocket actually is.