
The Patterns Of Your Dreams

posted Jun 27, 2015, 8:39 AM by Ellen Pearlman

[Image: the presented image on the left, and a facsimile of how the brain reconstructs the image on the right.]

Japanese researcher Dr. Yukiyasu Kamitani and his team at the ATR lab in Kyoto have been using fMRI and EEG to scan people's brains and 'read their minds' for quite some time. They show a subject an image and then reconstruct that image from the scans. To do this, the subject lies inside an fMRI machine while also hooked up to EEG electrodes, staring up at images projected at the top of the fMRI tunnel.

[Image: TV image of the subject in the fMRI on the right, and the scanned image data and tracking devices on the left.]

The idea is to eventually let people control electronic devices without using their bodies. For people with 'locked-in' syndrome this would work really well, and it could conceivably make keyboards and buttons a 'thing of the past.' However, the ethics and privacy implications of this practice remain dubious at the moment.

Dr. Kamitani has taken the research a step further in the past few years. He puts subjects into fMRIs with EEG caps on, asks them to fall asleep, then wakes them up and asks what they were dreaming about.

[Image: one part of the brain, popped out in a 3D scan, that is active during dreaming.]

[Image: it might mean something like this: when all these parts of the brain light up, the subject was dreaming of a female.]

If they wake up enough people, and enough of them report dreaming of a female, they can build a statistical algorithm that infers "female" from which parts of the brain were active in the scan. Though the method is probabilistic rather than exact, it is possible to figure out what people are dreaming about, at least in terms of broad categories.
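The idea can be sketched in a few lines of code. This is not Kamitani's actual pipeline; it is a minimal toy version, with made-up simulated voxel data, of what "build a statistical algorithm from many labelled awakenings" means: average the scans for each reported category, then label a new scan with the nearest average pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each awakening report pairs a broad dream category
# (like "female" or "writing") with a vector of voxel activations.
N_VOXELS = 50
categories = ["female", "writing"]

# Simulated "ground truth" activation templates, one per category.
templates = {c: rng.normal(size=N_VOXELS) for c in categories}

def simulate_scan(category, noise=0.5):
    """A noisy scan drawn around the category's template pattern."""
    return templates[category] + rng.normal(scale=noise, size=N_VOXELS)

# Collect many labelled awakenings and average them into centroids --
# the statistical model the article describes, in its simplest form.
training = [(c, simulate_scan(c)) for c in categories for _ in range(40)]
centroids = {
    c: np.mean([scan for label, scan in training if label == c], axis=0)
    for c in categories
}

def decode(scan):
    """Predict the dream category whose centroid is nearest to the scan."""
    return min(centroids, key=lambda c: np.linalg.norm(scan - centroids[c]))

print(decode(simulate_scan("female")))  # with clean enough data: "female"
```

With more categories and noisier data, the nearest-centroid step would be replaced by a proper classifier, but the structure is the same: many labelled scans in, a category guess out.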

[Image: an algorithm for letter characters; the dreamer reported a dream of writing.]

*******************

Australian artist Laura Jade, along with neuroscientist Peter Simpson-Young and programmer Sam Gentle, made a project called "Brain Light". The work fulfilled her Master's degree in Illumination Design at the University of Technology, Sydney.


She made a clear Perspex (plexiglass) outline of the brain and etched it with neural branching patterns, or dendrites. Using an Emotiv EEG headset, she translated the brain's electrical activity into light.

[Image: raw brainwave data.]

The programmer took the raw brainwave data and processed it into both light and sound.

[Image: brainwave data translated into light.]

[Image: brainwave data translated into sound.]

Calm and relaxed states produce alpha waves, which they represented as blue light. Agitated or excited states produce beta waves, which they represented as red light. Highly focused or absorbed states produce theta waves, represented here as green light.

[Image: theta waves as green light and sound waves.]

This is just the beginning of many more experiments to develop BCI interactions around themed experiences.
