Milestone 3
Milestone 3.1: Implementation
The team has chosen Python as the programming language for CorText, as all members are comfortable with the language and a large ecosystem of tools exists for acquiring and analyzing EEG signals, as well as for simulating mouse movement, in Python.
The main tool the team is using is NeuroPype, an application that provides many tools for processing both real-time and recorded EEG signals. The OpenBCI GUI has a built-in widget for Lab Streaming Layer (LSL) synchronization, allowing the group to sync the headset with NeuroPype.
The team initially used BrainFlow because of its compatibility with the OpenBCI Cyton board, but it was discovered to have issues with signal reliability and data formatting, causing the group to pivot to NeuroPype.
Milestone 3.2: Test
The group began by determining which nodes of the OpenBCI headset were required to effectively gather data from the user. Using the widgets provided by OpenBCI, the group determined that the front five nodes of the headset were the most responsive and could be deliberately triggered by actions such as eye movement and blinking. The central and rear nodes could not be deliberately activated to the same degree, and were therefore set aside in favor of the front five nodes.
Fig. 1. OpenBCI headset with selected nodes
After identifying which nodes were needed for data collection, and getting the necessary NeuroPype licenses approved, the group created its first pipeline in NeuroPype. Once all required libraries were downloaded, the OpenBCI headset was synced to NeuroPype via the LSL widget, allowing the new software to interpret the data in real time.
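The LSL side of this hookup can be approximated in plain Python with the pylsl library. This is a hypothetical sketch of the same synchronization step, not the group's actual pipeline; the stream type "EEG" and the window size are assumptions:

```python
def connect_eeg_stream(timeout=5.0):
    """Find an EEG stream on the local network and open an inlet to it."""
    from pylsl import StreamInlet, resolve_byprop  # lazy import; requires pylsl
    streams = resolve_byprop('type', 'EEG', timeout=timeout)
    if not streams:
        raise RuntimeError("no EEG stream found; is the OpenBCI GUI LSL widget running?")
    return StreamInlet(streams[0])

def read_window(inlet, n_samples=125):
    """Pull a fixed-size window of samples (about 0.5 s at 250 Hz)."""
    window = []
    while len(window) < n_samples:
        sample, _timestamp = inlet.pull_sample(timeout=1.0)
        if sample is not None:
            window.append(sample)
    return window

def front_channels(window, n=5):
    """Keep only the first n channels of each sample (the front electrodes)."""
    return [sample[:n] for sample in window]
```

In a live session, `connect_eeg_stream()` would block until the OpenBCI GUI publishes its stream, after which windows of samples can be read and trimmed to the front channels.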
In NeuroPype, the group began training a model to detect whether the user was thinking "left" or "right". The model was set up from a template provided by NeuroPype, allowing the group to jumpstart its testing from the provided outline. A logistic regression classifier was used for the initial model.
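Outside NeuroPype, the same classification step can be sketched with scikit-learn's LogisticRegression. The feature vectors below are synthetic stand-ins, not real EEG features; in the actual pipeline, NeuroPype handles feature extraction and training internally:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_lr(X, y):
    """Fit a logistic-regression classifier on feature vectors X with labels y."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf

# Synthetic demo: two well-separated clusters standing in for
# "left" vs. "right" feature vectors (5 features per trial).
rng = np.random.default_rng(0)
X_left = rng.normal(loc=-1.0, scale=0.3, size=(50, 5))
X_right = rng.normal(loc=1.0, scale=0.3, size=(50, 5))
X = np.vstack([X_left, X_right])
y = ["left"] * 50 + ["right"] * 50

clf = train_lr(X, y)
```

Logistic regression is a common first choice for binary BCI classification because it trains quickly on small trial counts and its decision boundary is easy to inspect.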
The group also set up a simple mouse-movement program using PyAutoGUI, which takes input from the headset and moves the cursor a set number of pixels left or right on-screen.
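A minimal version of that mouse-movement program might look like the following. The step size and label names are our own illustrative choices, not taken from the group's code:

```python
STEP_PX = 50  # assumed step size in pixels per classifier decision

def delta_for(prediction):
    """Map a classifier label to a horizontal pixel offset."""
    if prediction == "left":
        return -STEP_PX
    if prediction == "right":
        return STEP_PX
    return 0  # unknown label: don't move

def move_mouse(prediction):
    """Nudge the cursor left or right based on the classifier's output."""
    import pyautogui  # lazy import: PyAutoGUI needs an active display
    pyautogui.moveRel(delta_for(prediction), 0)
```

Separating the label-to-offset mapping from the PyAutoGUI call keeps the movement logic testable without a display, and makes it easy to tune the step size later.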
Fig. 2. NeuroPype output displaying left (purple) and right (red) data from the headset
Fig. 3. NeuroPype's standard Logistic Regression Pipeline Template
Milestone 3.3: Teamwork
Overall: The team's tools and libraries have shifted repeatedly, requiring constant communication and a shared understanding among members. In addition to meeting during Senior Design class and Senior Innovation class, the team meets once a week outside of class to discuss and accomplish its goals.
Victoria Beke - Prepared the Innovation Expo Abstract, preliminary demographics research.
Zoe Casten - Downloaded, troubleshot, and worked with NeuroPype API.
Janet Hamrani - Research on prior BCI projects.
Christian O'Connell - Wrote preliminary Python code for mouse input.
Jason Pinga - Research on available API options and decided on NeuroPype.
Matthew Vaughan - Composed milestone 3 and compiled documentation for display.