"Using an electroencephalogram (EEG) to measure brain activity while people performed mundane attention tasks, researchers identified brain signals that reveal when the mind is not focused on the task at hand or aimlessly wandering, especially after concentrating on an assignment.
Specifically, increased alpha brain waves were detected in the prefrontal cortex of more than two dozen study participants when their thoughts jumped from one topic to another, providing an electrophysiological signature for unconstrained, spontaneous thought. Alpha waves are slow brain rhythms whose frequency ranges from 9 to 14 cycles per second."
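The kind of measurement the quote describes, picking out alpha-band activity from an EEG channel, can be sketched as a band-power calculation. The sketch below uses a synthetic signal and illustrative parameters (sampling rate, duration, band edges from the quoted 9-14 Hz range); it is not the study's actual analysis pipeline.

```python
# Sketch: estimating relative alpha-band (9-14 Hz) power in one EEG channel.
# The signal is synthetic (a 10 Hz "alpha" rhythm buried in noise); all
# parameter choices here are illustrative assumptions.
import numpy as np
from scipy.signal import welch

fs = 256                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)   # 10 seconds of data
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)

# Power spectral density via Welch's method
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

# Fraction of total power that falls in the alpha band
alpha = (freqs >= 9) & (freqs <= 14)
alpha_power = psd[alpha].sum()
total_power = psd.sum()
print(f"relative alpha power: {alpha_power / total_power:.2f}")
```

A real analysis would do this per participant and per time window, comparing alpha power between on-task and mind-wandering periods.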
The source article also describes how they conducted the test.
Key information obtained:
EEG and fMRI are non-invasive technologies for measuring brain activity; EEG reads electrical brain waves directly, while fMRI measures blood-flow changes rather than brain waves
Consumer devices that can read brain waves are available for purchase at retailers such as Target and Amazon
In 2014, a company named “InteraXon” created a non-invasive device with dry (non-gel) electrodes that sit on the forehead and behind the ears of the user. It was able to detect alpha, beta, and gamma waves, and the technology was used to play a Star Wars shooting game.
Stanford researchers created a project (called NOIR) that can read a user’s brainwaves and have a robot execute the intended actions
They use a non-invasive hat with electrodes
They use a technique called steady-state visually evoked potential (SSVEP) to read what the user wants to do. A computer screen acts as the controller: it displays the scene in front of the user, and different objects on the display flicker at different frequencies. When the user focuses on one of them, the helmet recognizes that frequency in the signal from the user’s visual cortex. The user then thinks of the physical action to take, such as moving, pouring, or mixing a bowl, and the software detects each of these from the user’s motor cortex.
Also, they are able to adapt the technology to new users using machine learning algorithms.
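The SSVEP step above can be sketched as a frequency-matching problem: each object flickers at its own frequency, and the system picks the candidate frequency with the most power in the visual-cortex signal. The object names, frequencies, and signal below are illustrative assumptions, not NOIR's actual pipeline.

```python
# Sketch of SSVEP object selection: find which flicker frequency dominates
# a (simulated) visual-cortex EEG signal. All values are illustrative.
import numpy as np

fs = 250                                                 # sampling rate (assumed)
candidates = {"bowl": 8.0, "spoon": 10.0, "pot": 12.0}   # object -> flicker Hz

rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / fs)
# Simulate a user attending to the object flickering at 10 Hz
signal = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def power_at(f):
    # Summed spectral magnitude in a narrow band around frequency f
    band = (freqs >= f - 0.25) & (freqs <= f + 0.25)
    return spectrum[band].sum()

target = max(candidates, key=lambda obj: power_at(candidates[obj]))
print(target)
```

Real systems use more robust detectors (e.g. canonical correlation analysis across multiple electrodes), but the core idea of matching attended flicker frequency to spectral peaks is the same.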
Typing methods for people who can’t communicate have existed since 1988
Brain-computer interfaces (BCIs) are a class of typing systems that use a multitude of methods to allow people to type without physical movement
P300 is a method of BCI that looks for a characteristic brain response (a positive deflection about 300 ms after an attended stimulus) to decide which character to type
Motor imagery is a method of BCI that uses imagined leg and arm movements (the mental thought of moving them, without actual movement) to type
SSVEP is a method of BCI that detects, via the visual cortex, when the subject attends to an image flashing at a certain frequency, and can use that to type
These three can be combined to work in unison to type faster
SSVEP does not require adaptation to unique users
All of these methods use some sort of visual keyboard to type
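The P300 approach with a visual keyboard can be sketched as a row/column speller: rows and columns of a character grid flash in turn, and the typed character sits at the intersection of the row flash and column flash that evoked the strongest response. The scores below are simulated, standing in for a real EEG classifier's output; the grid layout and values are illustrative.

```python
# Sketch of a P300 row/column speller. Responses are simulated scores
# (a real system would get these from an EEG classifier); everything
# here is an illustrative assumption.
import numpy as np

grid = np.array([list("ABCDEF"),
                 list("GHIJKL"),
                 list("MNOPQR"),
                 list("STUVWX"),
                 list("YZ1234"),
                 list("56789_")])

rng = np.random.default_rng(1)
target_row, target_col = 2, 3   # user is attending to "P"

# Baseline noise for each row/column flash, plus a bump when the
# attended row or column flashes (the simulated P300 response).
row_scores = rng.normal(0, 0.1, 6)
col_scores = rng.normal(0, 0.1, 6)
row_scores[target_row] += 1.0
col_scores[target_col] += 1.0

picked = grid[np.argmax(row_scores), np.argmax(col_scores)]
print(picked)
```

In practice each row/column is flashed many times and the responses averaged, since a single-trial P300 is weak relative to background EEG noise.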