The first part of the project was taking the DEAP database and training a model to predict emotion (specifically valence) from 32 channels of EEG data in 4-second chunks. With this model, we can stream sensor data in real time, predict the user's emotion, and use that prediction to adapt the VR environment to the user's mood.
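A minimal sketch of this kind of pipeline is shown below. It does not use the actual DEAP files or the project's real model; instead it generates random stand-in windows with DEAP's preprocessed shape (32 channels sampled at 128 Hz, so a 4-second chunk is a 32×512 array), extracts per-channel band-power features, and fits a simple logistic-regression classifier for high vs. low valence. All names and the feature/classifier choices here are illustrative assumptions, not the project's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in for DEAP: random EEG windows instead of the real recordings.
# DEAP's preprocessed signals are sampled at 128 Hz, so one 4-second,
# 32-channel chunk is a (32, 512) array.
rng = np.random.default_rng(0)
n_windows, n_channels, n_samples, fs = 200, 32, 512, 128

X_raw = rng.standard_normal((n_windows, n_channels, n_samples))
y = rng.integers(0, 2, size=n_windows)  # 1 = high valence, 0 = low valence

def band_power(windows, fs, band):
    """Mean spectral power of each channel within a frequency band (via FFT)."""
    freqs = np.fft.rfftfreq(windows.shape[-1], d=1 / fs)
    power = np.abs(np.fft.rfft(windows, axis=-1)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[..., mask].mean(axis=-1)

# One feature per channel per band: theta, alpha, beta, gamma.
bands = [(4, 8), (8, 13), (13, 30), (30, 45)]
X = np.hstack([band_power(X_raw, fs, b) for b in bands])  # shape (200, 128)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

In a live setting, the same `band_power` features would be computed on each incoming 4-second window from the headset and fed to `clf.predict`, with the predicted label driving the VR environment.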