The Blended Reality Performance System (BRPS) is a key effort within the development of the Balanced Blended Space (BBS) framework. BBS provides a language for describing interactions between physical and virtual environments, as well as between human and AI agents. BBS is central to how BRPS is envisioned because it supplies a philosophical and structural foundation for how real-world and virtual components, along with cognitive and computational intelligences, can interact symmetrically.
BBS proposes that, in an ideal system, interaction between different spaces (physical, virtual, and conceptual) should be symmetrical: all elements, whether cognitive or computational, are treated as equal partners in a shared reality. In the context of performance, this means that both human and AI elements contribute meaningfully, without one being a mere tool for the other.
BBS supports the idea of multiple sensory pathways (visual, auditory, tactile, etc.), allowing for complex communication and interaction between virtual and physical realities. It also emphasizes a blended reality, where these interactions occur simultaneously, leveraging technologies like AI, AR/VR, and projection systems. As today's combinative reality grows more complex, traditional categories such as VR and AR no longer suffice as a descriptive language.
The BRPS builds on the BBS framework and applies it specifically to performance fields such as theater, dance, and interactive art. It is designed to facilitate real-time interactions between virtual and physical performers (e.g., actors, puppets, projections), as well as between audiences that may be present in both the physical and virtual realms. The BRPS is essentially the operational system that makes the blended space function in performance contexts.
Key Components of BRPS:
Real-Time Synchronization:
One of the core elements of BRPS is the ability to synchronize real-world physical performers and objects (like actors, puppets, and set pieces) with their virtual counterparts (avatars, digital puppets, projections, etc.).
The system leverages sensors, robotics, AI-driven virtual actors, and motion capture to ensure that interactions between virtual and physical elements occur smoothly and in real-time.
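As a minimal sketch of this synchronization idea (the joint names and data shapes here are illustrative assumptions, not a specification from BRPS), each frame of motion-capture data from a physical performer can be pushed into a virtual counterpart's pose:

```python
from dataclasses import dataclass

@dataclass
class MocapSample:
    """One frame of motion-capture data for a single tracked joint (hypothetical schema)."""
    joint: str
    x: float
    y: float
    z: float

class VirtualAvatar:
    """Mirrors a physical performer by applying mocap samples to its pose."""
    def __init__(self):
        self.pose = {}  # joint name -> (x, y, z)

    def apply(self, sample: MocapSample) -> None:
        self.pose[sample.joint] = (sample.x, sample.y, sample.z)

def sync_frame(avatar: VirtualAvatar, samples: list[MocapSample]) -> None:
    """Push one frame of physical motion into the virtual counterpart."""
    for s in samples:
        avatar.apply(s)

# A physical performer's hand movement is reflected in the avatar in the same frame.
avatar = VirtualAvatar()
sync_frame(avatar, [MocapSample("right_hand", 0.4, 1.2, 0.1)])
```

In a real deployment this loop would run at the capture device's frame rate, with latency compensation; the sketch only shows the per-frame mapping.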
AI as Collaborative Partner:
BRPS aligns with the BBS principle that AI should be a collaborator rather than a tool. This means that AI can be involved in controlling virtual puppets, generating real-time narratives, or even interacting with physical performers through pre-programmed behaviors or adaptive responses.
The AI would also be involved in decision-making processes, such as adjusting performance elements based on audience engagement or real-time environmental inputs.
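One way such a decision process could look, as a sketch only (the engagement thresholds and beat names are hypothetical, not part of BRPS), is a policy that maps a normalized audience-engagement score to the next performance beat:

```python
def adjust_scene(engagement: float) -> str:
    """Choose the next performance beat from an engagement score in [0, 1].

    Thresholds and beat names are illustrative assumptions.
    """
    if engagement < 0.3:
        return "interactive_beat"   # attention is drifting: pull the audience back in
    if engagement > 0.8:
        return "extend_scene"       # the moment is landing: stay with it
    return "advance_narrative"      # normal progression
```

An adaptive AI collaborator would feed this kind of policy with live inputs (audience sensors, environmental data) rather than a single scalar, but the decision shape is the same.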
Cross-Reality Interaction:
The BRPS aims to create a blended space where physical and virtual actors, objects, and environments interact seamlessly. For example, a virtual puppet could interact with a physical set, or an AR projection could respond to real-time movements of a performer.
This cross-reality interaction expands the possibilities of storytelling, creating a unified experience for both physical and virtual audiences.
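A common way to wire this kind of cross-reality interaction is a publish/subscribe event bus, where physical sensors publish events and virtual elements (or vice versa) subscribe to them. The topic names and payloads below are assumptions for illustration:

```python
from collections import defaultdict
from typing import Callable

class CrossRealityBus:
    """Minimal pub/sub bus linking physical sensors and virtual elements."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subs[topic]:
            handler(event)

# A projection system reacts whenever a physical performer moves.
bus = CrossRealityBus()
log = []
bus.subscribe("performer.moved", lambda e: log.append(("projection", e["x"])))
bus.publish("performer.moved", {"x": 2.5})
```

The decoupling matters here: neither side needs to know the other exists, which is what lets a virtual puppet respond to a physical set piece (or the reverse) without bespoke wiring.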
Modularity and Scalability:
The BRPS would ideally be a modular system, meaning it could be adapted for various types of performances—from small, intimate productions to large-scale theatrical shows. Its scalability is key to integrating both low-tech physical components (like shadow puppetry) and high-tech virtual elements (like VR environments or AI-driven actors).
Modularization also allows BRPS to be used across different types of projects, whether narrative-driven (such as shadow puppetry) or more abstract, immersive experiences (such as installation art or virtual reality performances).
Integration of Multiple Sensory Pathways:
Since BBS is based on the idea of using multiple sensory pathways, BRPS will likely integrate visual, auditory, and haptic feedback into its performances. This could mean synchronized lighting and projection systems, real-time soundscapes, and even tactile feedback for the audience (whether through wearables or haptic technologies in VR).
Audience Interaction:
The BRPS system incorporates audience engagement, where the audience (both in-person and virtual) can interact with the performance and influence its direction. For example, the audience could vote on narrative directions, interact with virtual characters, or participate through gestures or movements that affect the physical performance.
This mirrors the BBS goal of symmetrical interaction between cognitive (human) and computational (AI) agents.
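The voting example above can be sketched as a simple tally that picks the next narrative branch; the branch names and tie-breaking rule are illustrative assumptions:

```python
from collections import Counter

def tally_votes(votes: list[str], options: list[str]) -> str:
    """Pick the next narrative branch from audience votes.

    Invalid votes are ignored; with no valid votes (or a tie), the
    earliest-listed option wins. Both rules are assumptions for this sketch.
    """
    counts = Counter(v for v in votes if v in options)
    if not counts:
        return options[0]
    return max(options, key=lambda o: counts[o])

# Mixed physical and virtual audiences submit votes through apps or gestures.
branch = tally_votes(["forest", "castle", "forest"], ["castle", "forest"])
```

In a live show, votes would stream in over a window of time and the result would feed the AI's narrative-adjustment logic.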
Seamless Transition Between Spaces:
One key to the BRPS is creating fluid transitions between physical and virtual spaces. This means the system would allow a performer or an object to move from the physical world into a virtual one (through projection or AR/VR) or vice versa, without disrupting the continuity of the performance.
The audience, too, would experience these seamless transitions, whether they are physically present or attending virtually.
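One minimal way to model such a transition, assuming a simple linear cross-fade (the blending curve is an assumption, not a BRPS specification), is a pair of weights that overlap the physical and virtual representations so neither audience sees a hard cut:

```python
def crossfade_weights(t: float) -> tuple[float, float]:
    """Blend weights (physical, virtual) at normalized transition time t in [0, 1].

    At t=0 the performer is fully physical; at t=1 fully virtual; in between,
    both representations are active simultaneously.
    """
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range times
    return (1.0 - t, t)
```

A production system would likely use an eased curve and tie `t` to projection brightness, avatar opacity, or audio panning, but the overlap principle is the same.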
Examples of BRPS in Practice:
Virtual Puppets Interacting with Physical Objects: A virtual puppet, controlled by AI or motion capture, could interact with physical objects on stage, responding in real-time to the manipulation of set pieces or other actors.
Projection Mapping for Blended Environments: Projection systems could be used to create virtual environments that overlap with physical set pieces, allowing performers to interact with both. These projections could change in response to the narrative or audience input.
Audience Participation in a Blended Reality: Virtual and physical audience members could interact with both virtual and physical actors through apps, voice commands, or gestures, influencing the flow of the performance.
Research and Development Areas:
To develop the BRPS, several specific research and development areas need to be addressed:
Hardware Integration:
Research into the physical components needed for interaction between puppets, performers, and virtual systems (e.g., sensors, motion capture, projection mapping tools, robotic systems).
Software and AI Development:
Development of AI systems that can interact in real-time with human performers, audience members, and physical set pieces. This includes designing algorithms for narrative adjustment and interaction based on sensory input.
User Experience Design:
UX design to ensure seamless interaction for both physical and virtual audiences. This would include developing intuitive ways for virtual audience members to engage with the physical aspects of the performance.
Cross-Reality Synchronization Systems:
Systems that enable real-time communication and synchronization between virtual actors, physical puppets, and set pieces.
Multi-Sensory Integration:
Designing a system that incorporates audio, visual, and haptic feedback for an immersive experience. This could include everything from projection mapping, real-time soundscapes, to touch-sensitive surfaces for virtual or physical interactions.
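A multi-sensory cue of this kind might be modeled as a single object that dispatches every defined channel at once; the channel names and payload strings below are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensoryCue:
    """One synchronized moment across sensory pathways (illustrative schema)."""
    name: str
    audio: Optional[str] = None    # e.g. a soundscape file
    visual: Optional[str] = None   # e.g. a projection-mapping preset
    haptic: Optional[str] = None   # e.g. a wearable vibration pattern

def fire_cue(cue: SensoryCue) -> list[str]:
    """Dispatch every defined channel of a cue together (stub dispatchers)."""
    dispatched = []
    for channel, payload in (("audio", cue.audio),
                             ("visual", cue.visual),
                             ("haptic", cue.haptic)):
        if payload is not None:
            dispatched.append(f"{channel}:{payload}")
    return dispatched

# A storm cue fires sound and projection together; haptics are simply omitted.
storm = SensoryCue("storm", audio="rain.wav", visual="lightning_preset")
```

Grouping channels into one cue is what keeps the pathways synchronized: the show-control layer triggers the cue, not each device separately.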
The BRPS is essentially the applied version of BBS in the performance space, taking the philosophical and structural ideas of symmetry and applying them to real-world performances. The interaction between AI, physical performers, and virtual actors is a direct expression of the BBS principle that no element should be treated merely as a tool but as an equal contributor to the blended reality experience. The system enables the symmetry of interaction that BBS seeks to promote, bridging the gap between virtual and physical worlds.
In conclusion, the Blended Reality Performance System (BRPS) is the practical implementation of the Balanced Blended Space (BBS) within the context of performance arts. It facilitates real-time interaction between physical and virtual performers, leveraging AI and sensory inputs to create a fully immersive and integrated experience. By supporting cross-reality interaction, audience participation, and seamless transitions, the BRPS exemplifies the core principles of BBS, offering a new paradigm for performance systems.