My methodology involves targeted memory reactivation (TMR), which can lead to improvements on the task that was cued during sleep. In my study, we cued participants for the mindfulness task, so I analyzed accuracy on this task before vs. after sleep; we expect participants to improve post-sleep. I ran a paired t-test and a Wilcoxon signed-rank test on these data and created a bar graph to visualize them.
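A minimal sketch of this pre- vs. post-sleep comparison with scipy. The accuracy values here are made-up placeholders, not my real data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.uniform(0.6, 0.8, size=12)           # fake pre-sleep accuracy, 12 participants
post = pre + rng.normal(0.05, 0.03, size=12)   # fake post-sleep accuracy (slight boost)

# Paired t-test: the same participants are measured twice
t_stat, t_p = stats.ttest_rel(post, pre)
# Wilcoxon signed-rank test: non-parametric counterpart to the paired t-test
w_stat, w_p = stats.wilcoxon(post, pre)
print(f"t = {t_stat:.2f} (p = {t_p:.3f}); Wilcoxon W = {w_stat:.1f} (p = {w_p:.3f})")
```

The paired versions of both tests are used because pre and post values come from the same people.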
I wrote code to compute the standard deviation of the breath-to-breath intervals across participants, comparing the few seconds before and after the cue to determine whether the cue influences breathing patterns. I ran a t-test on these values and created this bar graph. Since I don't have all my data yet, the graph uses data from only two participants, so its results are preliminary. Once data collection is complete, I will rerun my code and get the same graph with the final results.
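A rough sketch of the interval-variability measure, using fake breath-peak times (the window length and breathing rate here are assumptions for illustration). Each participant's (pre, post) pair would then feed a paired t-test:

```python
import numpy as np

def bb_interval_sd(peak_times, cue_time, window=15.0):
    """SD of breath-to-breath intervals in the `window` seconds
    before and after `cue_time` (peak_times in seconds)."""
    peaks = np.asarray(peak_times)
    pre = np.diff(peaks[(peaks >= cue_time - window) & (peaks < cue_time)])
    post = np.diff(peaks[(peaks >= cue_time) & (peaks < cue_time + window)])
    return pre.std(ddof=1), post.std(ddof=1)

# Fake breath peaks (~one breath every 3 s), cue at t = 30 s
rng = np.random.default_rng(1)
peaks = np.cumsum(rng.normal(3.0, 0.3, size=25))
sd_pre, sd_post = bb_interval_sd(peaks, cue_time=30.0)
```

With real data, `scipy.stats.ttest_rel` on the per-participant `sd_pre`/`sd_post` values gives the bar graph's statistic.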
It is important to look at data across multiple trials, so here I am comparing the frequency of the respiration waves. I am looking at four cues right now, and we can see that the blue and orange traces line up across trials.
This graph shows time-frequency representations, power spectral density, and cross-spectral density for the respiration signal around cue onset.
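All three quantities can be computed with scipy.signal; this sketch uses a fake 0.25 Hz sine (≈15 breaths/min) and an assumed 10 Hz sampling rate in place of my real recordings:

```python
import numpy as np
from scipy import signal

fs = 10.0                      # assumed respiration sampling rate (Hz)
t = np.arange(0, 300, 1 / fs)  # 5 minutes of fake data
rng = np.random.default_rng(2)
resp = np.sin(2 * np.pi * 0.25 * t) + 0.1 * rng.normal(size=t.size)

# Power spectral density (Welch's method)
f_psd, psd = signal.welch(resp, fs=fs, nperseg=256)
# Cross-spectral density between two signals (here, the trace vs. a shifted copy)
f_csd, csd = signal.csd(resp, np.roll(resp, 5), fs=fs, nperseg=256)
# Time-frequency representation (spectrogram)
f_sp, t_sp, Sxx = signal.spectrogram(resp, fs=fs, nperseg=128)

peak_freq = f_psd[np.argmax(psd)]  # should sit near the 0.25 Hz breathing rate
```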
I created a violin plot that represents respiration rate across the three conditions, using a fake-data CSV file that records how many breaths each participant took. Once I have my final data, I will rerun my code, and I hope the graph will show participants taking fewer breaths when cued vs. uncued, as this would indicate the cue reminded them of the slower breathing pattern from the mindfulness task.
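A minimal version of the plot with matplotlib, with three made-up condition names and fake breath counts standing in for the CSV:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
# Fake breaths-per-minute values for 15 participants per condition
conditions = {
    "cued": rng.normal(12, 2, size=15),      # hoped-for slower breathing
    "uncued": rng.normal(15, 2, size=15),
    "baseline": rng.normal(16, 2, size=15),
}

fig, ax = plt.subplots()
ax.violinplot(list(conditions.values()))
ax.set_xticks([1, 2, 3])
ax.set_xticklabels(list(conditions.keys()))
ax.set_ylabel("Breaths per minute")
```

With the real CSV, the three arrays would instead come from grouping the file by condition.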
After creating a simple graph to visualize respiration, I used NeuroKit to look at 100 seconds of data in more depth. NeuroKit filters and analyzes the data to produce metrics that can be useful to compare across participants.
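NeuroKit's respiration pipeline essentially cleans the signal and detects inhalation peaks; this scipy-only sketch approximates those two steps on fake data (the sampling rate, filter band, and breathing rate are all assumptions, not NeuroKit's exact internals):

```python
import numpy as np
from scipy import signal

fs = 10                          # assumed sampling rate (Hz)
t = np.arange(0, 100, 1 / fs)    # 100 s of data, as in the figure
rng = np.random.default_rng(4)
resp = np.sin(2 * np.pi * 0.2 * t) + 0.2 * rng.normal(size=t.size)

# Band-pass to a typical respiration band (~0.05-0.5 Hz)
sos = signal.butter(4, [0.05, 0.5], btype="bandpass", fs=fs, output="sos")
clean = signal.sosfiltfilt(sos, resp)

# Inhalation peaks -> breathing rate (at most one peak per 2 s)
peaks, _ = signal.find_peaks(clean, distance=fs * 2)
rate_bpm = 60 * (len(peaks) - 1) / (t[peaks[-1]] - t[peaks[0]])
```

With NeuroKit itself, `neurokit2.rsp_process(resp, sampling_rate=fs)` returns these and further metrics in one call.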
I plotted a participant's respiration to visualize whether the respiration pattern is similar when hearing a cue while asleep vs. awake. The uncued and cued segments are taken from the 100 seconds before and after the cue was played, respectively.
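Extracting those windows is a simple slicing step; the function name and sampling rate here are placeholders:

```python
import numpy as np

def epochs_around_cue(sig, fs, cue_idx, window_s=100):
    """Return the window_s-second segments of `sig` immediately
    before and after the cue at sample index `cue_idx`."""
    w = int(window_s * fs)
    return sig[max(0, cue_idx - w):cue_idx], sig[cue_idx:cue_idx + w]

# Fake 10-minute respiration trace at 10 Hz, cue at t = 300 s
fs = 10
sig = np.sin(2 * np.pi * 0.25 * np.arange(0, 600, 1 / fs))
uncued, cued = epochs_around_cue(sig, fs, cue_idx=300 * fs)
```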
These three plots show EOG (eye movement) data at three points during my study. When you move your eyes in a dream, you also move them in real life, so researchers can use predetermined eye-movement patterns as signals that a participant has become lucid in a dream.
The first graph shows a participant's eye movements while awake and instructed to practice the LRLR eye signal. The second graph shows the lucid (LRLR) signal during REM sleep. The last graph shows what a single rapid eye movement looks like. The code I wrote detects these rapid eye movements and highlights them in red.
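One common way to detect rapid eye movements is to flag steep deflections in the EOG and merge nearby detections into events; this is a sketch of that idea with made-up thresholds and a synthetic trace, not my exact detector:

```python
import numpy as np

def detect_rems(eog, fs, vel_thresh=200.0, min_gap_s=0.5):
    """Flag samples where the EOG slope exceeds vel_thresh (uV/s),
    then merge detections closer than min_gap_s into single events.
    Returns a list of (start, end) sample indices."""
    vel = np.abs(np.diff(eog)) * fs          # per-sample velocity in uV/s
    idx = np.flatnonzero(vel > vel_thresh)
    if idx.size == 0:
        return []
    gaps = np.flatnonzero(np.diff(idx) > min_gap_s * fs)
    starts = np.r_[idx[0], idx[gaps + 1]]
    ends = np.r_[idx[gaps], idx[-1]]
    return list(zip(starts, ends))

# Fake EOG: flat baseline with two sharp 300 uV deflections
fs = 100
eog = np.zeros(1000)
eog[200:220] = 300.0
eog[600:620] = -300.0
events = detect_rems(eog, fs)
```

Each `(start, end)` pair can then be shaded in red on the plot, e.g. with `ax.axvspan(start / fs, end / fs, color="red", alpha=0.3)`.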
This graph accompanies the results of a Chi-square analysis of the number of participants who became lucid, grouped by the task they completed. The code I wrote currently runs the Chi-square analysis on fake data, as my data is not fully collected yet. Once all of my data is collected, I will rerun the code on the real data and get a graph that looks similar to this, which will show whether my results are statistically significant.
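The test itself is one scipy call on a contingency table; the counts below are invented stand-ins like the fake data the graph currently uses:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows = task condition, columns = (lucid, not lucid); all counts are fake
table = np.array([[8, 12],
                  [3, 17]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

The real analysis only needs the observed lucid/non-lucid counts per task swapped into `table`.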
This is a hypnogram, a graph that displays a person's sleep stages over the course of a sleep period. This graph uses data I collected during my time in the lab and also marks when cues were played during sleep.
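A hypnogram is a step plot of one stage label per scoring epoch (30 s is the standard epoch length); this sketch uses a fabricated stage sequence and fabricated cue times, not my recording:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

# Fake stage sequence, one label per 30-s epoch
stage_order = {"Wake": 4, "REM": 3, "N1": 2, "N2": 1, "N3": 0}
stages = ["Wake"] * 4 + ["N1"] * 2 + ["N2"] * 20 + ["N3"] * 30 + ["N2"] * 10 + ["REM"] * 14
y = [stage_order[s] for s in stages]
t = np.arange(len(stages)) * 30 / 60  # epoch start times in minutes

fig, ax = plt.subplots()
ax.step(t, y, where="post")
ax.set_yticks(list(stage_order.values()))
ax.set_yticklabels(list(stage_order.keys()))
ax.set_xlabel("Time (min)")
# Fake cue times marked as dashed vertical lines, as in the figure
for cue_min in (25, 32):
    ax.axvline(cue_min, color="red", linestyle="--")
```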
The code I wrote takes my EEG data and automatically sleep-stages it, and the colorful graph below shows how confident the classifier is in each predicted sleep stage.
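Automatic stagers typically output a probability for each stage in each epoch, and the confidence plot is a stacked area of those probabilities. This sketch fakes a probability matrix in place of the classifier's output (the stage names and epoch count are assumptions):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
n_epochs, stage_names = 100, ["Wake", "N1", "N2", "N3", "REM"]

# Fake per-epoch stage probabilities (each row sums to 1), standing in
# for the classifier's predicted-probability output
logits = rng.normal(size=(n_epochs, len(stage_names)))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

confidence = probs.max(axis=1)  # the model's confidence in its chosen stage

fig, ax = plt.subplots()
ax.stackplot(np.arange(n_epochs), probs.T, labels=stage_names)
ax.set_xlabel("Epoch")
ax.set_ylabel("Probability")
ax.legend(loc="upper right")
```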