BILDS

Biocybernetically Inspired Lucid Dreaming Simulator

System Description

To me, lucid dreaming has always been an attractive and mysterious phenomenon, surrounded by plenty of skepticism and pseudo-science. In lucid dreams, the dreamer is aware that they are dreaming, which allows them to take control of what is happening in the dream and do anything they want. Training meditation and self-regulation has been shown to be one of the most effective ways to induce lucid dreams, letting the dreamer learn how to control their inner state while dreaming. After researching the scientific evidence on lucid dreaming, I discovered that brainwaves recorded while people sleep (electroencephalography, EEG) were key to scientifically proving the existence of lucid dreamers. I then decided to combine neurofeedback technologies for meditation (a wearable brain-computer interface, BCI) with Virtual Reality (VR) simulation and biocybernetic adaptation principles to create the first biocybernetically inspired lucid dreaming simulator. The system was called BILDS (from the Swedish word for illustration or mental picture): Biocybernetically Inspired Lucid Dreaming Simulator.

Design Process

The design of the training simulator followed a classic closed-loop system approach. Brainwaves are collected and streamed to a Meditation State Detector that separates the signal into individual EEG bandpowers: γ (gamma), β (beta), α (alpha), θ (theta), and δ (delta). These brainwave patterns have specific physiological signatures during meditation states. The Biocybernetic Loop Engine (BL Engine), a Unity toolbox that I designed years ago, is used to get the EEG bandpowers from a wearable BCI system in real time and connect them with the simulation variables (V) in the virtual environment. Adaptive rules are programmed in the BL Engine using logic blocks (a visual programming language), allowing individual bandpowers to modify the simulation variables. The Dreamed World Simulator uses this physiological modulation to display the changes in VR, thereby closing the interactive loop.
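Since the BL Engine wires these mappings visually rather than in code, there is no single listing to reproduce here; still, a minimal Unity C# sketch can illustrate what one adaptive rule does. GetBandpower(), the chosen simulation variables, and the mapping values below are illustrative assumptions, not the toolbox's actual API.

```csharp
using UnityEngine;

// A minimal sketch of one adaptive rule, in the spirit of the BL Engine's
// logic blocks. GetBandpower() is a hypothetical stand-in for the normalized
// values the BL Engine streams from the wearable BCI in real time.
public class MeditationAdaptiveRule : MonoBehaviour
{
    public Light dreamLight;       // simulation variable: scene lighting
    public ParticleSystem rain;    // simulation variable: weather effect

    void Update()
    {
        float alpha = GetBandpower("alpha");  // relaxation marker, 0..1
        float theta = GetBandpower("theta");  // deep-meditation marker, 0..1

        // Rule: stronger alpha gradually brightens the dreamed world.
        dreamLight.intensity = Mathf.Lerp(dreamLight.intensity, 0.2f + alpha, Time.deltaTime);

        // Rule: weak theta lets rain fade in as a "restless dream" cue.
        var emission = rain.emission;
        emission.rateOverTime = Mathf.Lerp(40f, 0f, theta);
    }

    // Placeholder: in the real system this value comes from the BL Engine's
    // EEG stream; here it just returns a neutral constant.
    float GetBandpower(string band) => 0.5f;
}
```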


Diagram of the Physiological Computing System

Prototyping at the Game Jam (University of Waterloo, Spring 2019)

During the spring of 2019, the Games Institute at the University of Waterloo organized a game jam targeting VR games and applications. Together with Dr. Alan Pope and Luis E. Velez, I committed to designing and developing an initial, functional, and integrative prototype of the lucid dreaming simulator in VR. Our initial idea was to integrate physiological computing and procedural content generation technologies to create a more realistic engine for simulating dreams: a very complicated scenario for a two-day jam. After researching similar concepts (e.g., Lucid Loops) and brainstorming the initial concept, we came up with a simple but scalable prototype. BILDS uses VR and physiological adaptation to create an experience for training self-regulation skills while simulating lucid dreaming-like situations. We chose the affordable and commercially available Muse BCI, which allows mapping the brainwaves associated with meditation states. We then used Unity3D to create the VR simulation and the BL Engine to connect it with the physiologically adaptive layer. BILDS initially has two different levels: a sandbox and a face-your-fear scenario. The sandbox scenario uses the metaphor of an empty space that is filled with random elements (e.g., trees, animals) and sudden changes in environmental conditions (e.g., rain, daylight) while the dreamer is generating the correct meditative brainwaves. The goal of the sandbox is to replicate the nonsensical quality of dreams and to familiarize dreamers with the interfaces and VR equipment.
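To make the sandbox mechanic concrete, here is a minimal Unity C# sketch of threshold-gated spawning: while the meditation index stays high, random dream elements populate the empty space. The meditation index accessor, prefab list, and tuning values are illustrative assumptions, not the jam build's actual code.

```csharp
using UnityEngine;

// Sketch of the sandbox mechanic: sustained meditation fills the empty
// dream space with random elements (trees, animals, etc.).
public class SandboxDreamFiller : MonoBehaviour
{
    public GameObject[] dreamPrefabs;        // e.g. trees, animals
    public float meditationThreshold = 0.7f; // minimum index to keep spawning
    public float spawnInterval = 3f;         // seconds between spawns while calm

    float timer;

    void Update()
    {
        float meditationIndex = GetMeditationIndex(); // 0..1, from the BCI stream

        // Below threshold: nothing appears, and the spawn timer resets.
        if (meditationIndex < meditationThreshold) { timer = 0f; return; }

        timer += Time.deltaTime;
        if (timer >= spawnInterval && dreamPrefabs.Length > 0)
        {
            timer = 0f;
            // Drop a random element somewhere near the dreamer.
            var prefab = dreamPrefabs[Random.Range(0, dreamPrefabs.Length)];
            var pos = transform.position + Random.insideUnitSphere * 10f;
            pos.y = 0f; // keep spawns on the ground plane
            Instantiate(prefab, pos, Quaternion.identity);
        }
    }

    // Placeholder for the normalized meditation score delivered by the BL Engine.
    float GetMeditationIndex() => 0.5f;
}
```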

The face-your-fear scenario presents a nightmare-like situation: a gigantic spider in a dark forest. To defeat the spider, the dreamer has to keep calm and produce the desired neurophysiological responses associated with meditation and relaxation states. Again, environmental variables can modify the scene in real time, adding challenge to the already difficult task of defeating the spider. We used a fantastic high-quality asset of rigged low-poly animals with animations for both scenes.
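The defeat mechanic can be sketched as a simple feedback loop: sustained calm drains the spider's threat, while losing composure lets it recover. The calm score accessor, thresholds, and rates below are assumptions for illustration, not the tuned values from the jam build.

```csharp
using UnityEngine;

// Sketch of the face-your-fear loop: staying calm long enough banishes
// the spider; breaking composure lets its menace build back up.
public class SpiderEncounter : MonoBehaviour
{
    public float threat = 1f;          // 1 = spider at full menace, 0 = defeated
    public float calmThreshold = 0.65f;
    public float drainRate = 0.05f;    // threat lost per second while calm
    public float recoverRate = 0.03f;  // threat regained per second otherwise

    void Update()
    {
        float calm = GetCalmScore(); // 0..1, derived from alpha/theta bandpowers

        threat += (calm >= calmThreshold ? -drainRate : recoverRate) * Time.deltaTime;
        threat = Mathf.Clamp01(threat);

        if (threat <= 0f)
        {
            // The dreamer kept their composure long enough: banish the spider.
            gameObject.SetActive(false);
        }
    }

    // Placeholder for the relaxation score streamed by the BL Engine.
    float GetCalmScore() => 0.5f;
}
```

Framing the encounter as a slowly drained meter, rather than a single threshold check, rewards sustained self-regulation instead of a momentary spike of calm.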

We will keep researching and improving this initial prototype, and more information will be available soon!