I worked on the development of a biosignal-controlled video game that allows users to play Google’s “Dino Run” using only their eye movements. This project combined circuit design, biosensing, and Arduino programming to create an accessible and engaging hands-free gaming experience.
Electrooculography (EOG) measures electrical signals from eye movements, which can be used to control computer interfaces hands-free. Our project uses EOG to create a human-computer interface (HCI) that enables users to control a video game with eye movements.
This technology is especially beneficial for individuals with motor impairments, such as those with ALS or locked-in syndrome, who may be unable to use traditional input devices. Our goal was to develop an accessible, cost-effective EOG-based controller for assistive communication and adaptive gaming.
Figure 5: Visualization of electrode placement.
Circuitry & Sensors: Designed and built a circuit incorporating wearable biosensors to detect eye movements and blinks as input signals.
Arduino Microcontroller: The Arduino Due processes the filtered biosignals and translates them into keyboard commands corresponding to the up and down arrow keys.
Software Interface: The processed signals are mapped to in-game actions, allowing the player to jump or duck in the "No-WiFi Dinosaur Game" based on their eye movements.
Figure 3: Circuit prototype used to filter and amplify raw EOG signals.
Figure 1: The low-pass filter consists of C2 and R2, as indicated, and is located between the AD622 and LM741 (highlighted in green).
Figure 2: The high-pass filter consists of C2 and R2, as indicated, and is located between the AD622 and LM741 (highlighted in green).
Figure 9: Sample setup of an eye-tracker taking relative positioning into account [9]
To validate the system, a prototype was built using widely available electronic components. Electrodes were placed above and below the user’s eye to detect vertical movement. The EOG signals were amplified and filtered to ensure accurate control inputs. Threshold values were adjusted to differentiate between natural blinks and intentional movements to minimize false positives.
Results demonstrated that the system successfully translated eye movements into game actions, providing an intuitive and hands-free gaming experience. The thresholding method allowed customization of sensitivity based on individual users' needs, making it adaptable for broader applications.
Demo Video
Signal Noise: Overcame interference by refining sensor placement and implementing filtering techniques in the Arduino code.
Response Time Optimization: Adjusted sensitivity thresholds to ensure quick and accurate gameplay responsiveness.
User Comfort & Wearability: Sourced a lightweight and adjustable sensor setup for comfortable, extended use.
Figure 4: Example of Blink vs. Looking Up Signal Amplitudes [8]
Figure 6: Recorded EOG Signal (looking down, up, down, up)
This project demonstrates the potential of biosignal-driven interfaces in assistive technology, making gaming more accessible for individuals with limited mobility. The same principles could be expanded for medical rehabilitation devices, human-computer interaction systems, and brain-computer interfaces.
Check out the project images, paper, and demo video! 🎮
Katelyn Kennedy – Bioengineering, UC San Diego
William Chen – Bioengineering, UC San Diego
Wesley Hsu – Bioengineering, UC San Diego
Edward McGee – Bioengineering, UC San Diego
Jerry Qiao – Bioengineering, UC San Diego