This ongoing research project explores the use of 3D audio and projection mapping as a means to create immersive experiences without isolating participants from the real world, essentially enabling an imaginary fantasy world to come to life in our own. We employ multiple levels of 3D audio and projection mapping (both directly within and on the umbrella, as well as throughout the room itself) in order to transport the participant into this virtual world.
As such, we can consider the umbrellas, and the overall system of which they are a part, to be an instrument of sorts in that they literally allow us to compose and explore reactive sonic environments in 6 degrees of freedom (6 DoF). Additionally, the umbrellas can also function as a measurement instrument, much like a stethoscope is a sonic instrument for medical examination; however, in this case, through the immersive experiences they purvey, the umbrellas enable us to examine the nature of our own perception of reality.
Ultimately, this research seeks to answer the question:
How can we structure our artistic practice to effectively create (audio-driven) immersive virtual worlds in XR?
In order to address this goal, we consider 3 related research sub-questions:
How can we use Augmented Objects to avoid isolating participants from the real world (due to obvious and/or intrusive technology such as VR head-mounted displays and headphones) by embedding a new (extended) reality into the real world?
On a more practical level, how can we combine multiple levels of spatial audio (near and far field) to create 6 DoF immersive experiences?
How can we use sensor data (and technology in general?) to create interactive environments that support a sense of presence, and in turn, enable immersive experiences?
Finally, I feel it is important to preface all further discussion by noting that this is still very much an ongoing research process, and as such, my thoughts about and understanding of this research are continually evolving.
At its core, this project exists at the intersection of artistic, perceptual, and technical research: although the presented output is creative in nature, the primary goal is to explore issues related to the aforementioned research questions. Thus, while the resulting pieces are meant to function as works of art, for me they serve another, arguably more important, function as part of the scientific method, namely as experiments in and explorations of this new form of composing immersive experiences (in particular, responsive sonic environments and spatial metacompositions).
Furthermore, I would go as far as to argue that, at its core, composition is a process of formalization and organization, and as such, the creation of such an instrument/system (through the development of hardware, software, and control systems) is a form of artistic practice in and of itself. This investigation of Augmented Objects and their capabilities necessitates a sort of ‘feedback loop of exploration and discovery’, in which intermediary results are continually evaluated, occasionally through larger projects such as the original A Day at the Beach immersive performance at REFRESH #3 in September 2020 and my upcoming MA performance in October 2021.
The issues I am currently investigating include:
Umbrellas as objects in a world vs windows into a world
Creation of responsive sonic environments and spatial metacompositions
Questions of interaction, specifically the balance between ‘doing’ and ‘being’
Mapping and transformation of data as a form of composition
The creation of immersion/presence (via spatial situation models (SSMs) and their acceptance as the primary egocentric reference frame (PERF))
Superposition of multiple “fragmentary spaces”
Using information from the motion capture system about their position and orientation, as well as that of other virtual and real objects in the space, the umbrellas are able to synthesize and play back sound, spatialized over their five-speaker domes in real time by software running on the embedded (RPi) system. A parallel visual process allows projection onto the umbrellas and the environment around them (e.g. the floor, walls, and other objects).
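To make this more concrete, the sketch below (purely illustrative, with hypothetical names of my own; in the real system this state lives in the Pure Data patch running on each umbrella's RPi) outlines the information each umbrella works from: its own tracked pose plus a list of virtual sound sources in the space.

```processing
// Purely illustrative sketch (all names hypothetical): the state each umbrella
// works from. In the actual system this lives in a Pure Data patch on the RPi,
// fed by the motion capture system and the Space Manager.

class UmbrellaState {
  PVector position;               // world-space position from motion capture
  PVector right, forward, up;     // world-space axes derived from its orientation
  ArrayList<VirtualSource> sources = new ArrayList<VirtualSource>();
}

class VirtualSource {
  int id;                         // which virtual sound this is
  PVector position;               // world-space position, updated continuously
  float level;                    // overall level of the source
}
```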
Core technologies include:
spatial audio - implemented in the room as a whole (ambisonics), and inside of the umbrellas (ambisonic equivalent panning)
motion capture - used to track the position and orientation of the umbrellas in 3D space and feed the data to the spatialization algorithms
projection mapping - to project (location specific) content onto the umbrellas and the floor
automated control systems (agents, swarms, vector fields, etc.) for control of sound location and content (see the sketch after this list)
Engineering concepts such as: embedded system design and IoT/wireless networking
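As a small, concrete example of the automated control systems mentioned above, here is a minimal Processing sketch (my own illustration, not part of the actual system; the room dimensions, field scale, and speed are arbitrary assumptions) of one such strategy: a virtual sound source drifting through the room, steered by a Perlin-noise vector field. In the real system, the resulting position would be handed to the Space Manager and spatialization processes rather than simply drawn to the screen.

```processing
// Illustrative only: a virtual sound source drifting through the room,
// steered by a Perlin-noise vector field.

PVector source;          // current position of the virtual sound (in meters)
float roomW = 8.0;       // assumed room dimensions, for illustration
float roomD = 6.0;
float noiseScale = 0.25; // spatial scale of the vector field
float speed = 0.02;      // how far the source moves per frame

void setup() {
  size(400, 300);
  source = new PVector(roomW * 0.5, roomD * 0.5);
}

void draw() {
  background(0);

  // Sample the vector field at the source position: Perlin noise -> flow direction.
  float angle = noise(source.x * noiseScale, source.y * noiseScale) * TWO_PI * 2;
  PVector flow = PVector.fromAngle(angle);
  source.add(PVector.mult(flow, speed));

  // Keep the source inside the room.
  source.x = constrain(source.x, 0, roomW);
  source.y = constrain(source.y, 0, roomD);

  // Visualize (and, in the real system, distribute) the updated position.
  float px = map(source.x, 0, roomW, 0, width);
  float py = map(source.y, 0, roomD, 0, height);
  ellipse(px, py, 10, 10);
}
```

The appeal of this kind of control is that a single low-level rule can produce continuous, organic movement of sounds through the space without direct moment-to-moment input from a performer.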
System diagram:
The diagram below details how the various technologies involved are organized into a functioning system. Note that the Space Manager is software (written in the Processing language) that keeps track of all the umbrellas and the various virtual sounds and distributes this information to the entire system.
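To give a rough idea of the kind of message traffic this implies (the address pattern, ports, and IP addresses below are placeholders of my own, not the actual protocol of the Space Manager), the Space Manager could broadcast each virtual sound's world position to the umbrellas over OSC, for example using the oscP5 library:

```processing
// Hedged sketch of how a Space Manager might distribute virtual sound positions
// over OSC. The address pattern "/source/pos", ports, and IPs are assumptions,
// not the project's actual protocol. Uses the oscP5 / netP5 libraries.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress[] umbrellas;   // one network address per umbrella's Raspberry Pi

void setup() {
  osc = new OscP5(this, 12000);                // listen locally (e.g. for mocap data)
  umbrellas = new NetAddress[] {
    new NetAddress("192.168.0.101", 9000),     // placeholder IPs
    new NetAddress("192.168.0.102", 9000)
  };
}

void draw() {
  // In the real system the positions come from the composition / control systems;
  // here we just broadcast a fixed example source every frame.
  sendSourcePosition(1, 2.0f, 1.5f, 1.2f);
}

void sendSourcePosition(int id, float x, float y, float z) {
  OscMessage msg = new OscMessage("/source/pos");
  msg.add(id);
  msg.add(x);
  msg.add(y);
  msg.add(z);
  for (NetAddress u : umbrellas) {
    osc.send(msg, u);
  }
}
```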
The computation of speaker gains for sound spatialization is performed using the Ambisonic Equivalent Panning algorithm, which requires a coordinate transformation from world (tracking system) space to local (umbrella-centric) space. Additionally, distance compensation scaling (a decrease in amplitude as the distance to a virtual sound increases) and horizon fading (ensuring that a sound is only audible when it is above the X-Y plane of the umbrella, i.e. you don’t hear sounds that are underneath it) add an increased sense of realism when localizing virtual sound sources. Details of the aforementioned processes are provided in the image carousel below.
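As a rough, simplified illustration of these three steps (a Processing-style sketch of my own, not the actual Pure Data implementation; the dot-product panning law here is only a stand-in for the real Ambisonic Equivalent Panning gains, and the clamping values are arbitrary), the per-speaker gains for a single virtual source could be computed along these lines:

```processing
// Simplified, illustrative sketch of the three steps described above,
// for one virtual source and one umbrella:
//   1. transform the source from world (tracking) space into umbrella-local space
//   2. pan it over the 5-speaker dome (a simple directional-similarity law
//      standing in for Ambisonic Equivalent Panning)
//   3. apply distance compensation and horizon fading

float[] speakerGains(PVector umbrellaPos, PVector localX, PVector localY, PVector localZ,
                     PVector sourceWorld, PVector[] speakerDirs) {
  // 1. World -> umbrella-local coordinates (project onto the umbrella's axes).
  PVector rel = PVector.sub(sourceWorld, umbrellaPos);
  PVector local = new PVector(rel.dot(localX), rel.dot(localY), rel.dot(localZ));

  float distance = local.mag();
  PVector dir = local.copy();
  dir.normalize();

  // 3a. Distance compensation: amplitude falls off as the source moves away
  //     (clamped so very close sources don't blow up).
  float distGain = 1.0f / max(distance, 0.5f);

  // 3b. Horizon fading: zero at or below the umbrella's X-Y plane, rising above it.
  float horizonGain = constrain(dir.z, 0.0f, 1.0f);

  // 2. Panning over the dome: per-speaker gain from the similarity between the
  //    source direction and each speaker direction (stand-in for the AEP gains).
  float[] gains = new float[speakerDirs.length];
  for (int i = 0; i < speakerDirs.length; i++) {
    float similarity = max(dir.dot(speakerDirs[i]), 0);
    gains[i] = similarity * distGain * horizonGain;
  }
  return gains;
}
```

Here speakerDirs would hold the five unit vectors pointing from the umbrella's center towards its speakers, expressed in umbrella-local coordinates.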
The journey from the initial concept to the current state included the following major steps:
The first prototype consisted of a tracked umbrella that sent position and orientation data to a Max patch implementing the core algorithm described above. The result was sent to portable speakers using a wireless microphone transmitter/receiver pair
Next, built a PVC pipe frame with large speakers and amplifiers and ported the code to Pure Data on a Raspberry Pi (controlled directly via TouchOSC)
First test with real hardware (small off-the-shelf speakers; tested different power options with Sébastien Schiesser, settled on a power bank)
With the help of Roman Jurt of the industrial design department, iteratively designed and fabricated custom enclosures for the speaker and amplifier from the off-the-shelf option → first tests ‘in’ a real umbrella
Iteratively designed and fabricated a custom handle for all the electronics (Raspberry Pi, audio interface, battery), again with Roman → sourced all necessary parts → built and tested the final prototype
Manufactured six additional umbrellas
The first proof of concept immersive performance with the umbrellas, A Day at the Beach, took place in September 2020 at REFRESH #3. At this early stage in the project, audio playback inside of the umbrellas had yet to be implemented, so the performance was more of a test of the artistic possibilities of linking the projection mapping capabilities of the system with ambisonic composition diffused in the room.
The original A Day at the Beach composition began as a project I proposed as part of an Electroacoustic Improvisation course I was enrolled in during the Spring semester of 2020. In this class we got together every two weeks with our electronic instruments (synthesizers, laptops, DIY devices, etc.) and practiced joint free improvisation through exercises and jam sessions. Then Covid hit… Naturally, the requirement for remote learning made this course difficult, so I proposed an exercise where I created a score for an interpretation of a day at the beach. Each musician was tasked with playing an element, e.g. the sunrise, waves, birds, people, etc.; we did the exercise twice and I assembled the results exactly according to the score. The initial results were chaotic but promising nonetheless, so I later composed a new mix (and accompanying visuals), picking individual elements from the two versions and arranging them in a very deliberate manner with the goal of creating a more nuanced narrative. The image below shows my notes from this process.
Finally, I created a new ambisonic upmix of the composition, extending the narrative into 3D: e.g. the ocean (the sound of the waves) was on one side of the space, the sun rose (sonically) in one corner and set in the opposite one, and the mid-afternoon storm came in from the sea and passed overhead before fading off into the distance on the opposite side of the room (see notes below). My colleague Stella Speziali created visuals for projection mapping onto the umbrella and floor that began very literally, became more abstract during the storm, and eventually returned to something more ‘realistic’ as the piece ended.
Short teaser video of the performance of A Day at the Beach from REFRESH #3 in September 2020