Wide-Area
Augmented Reality
Theater Experience
Fall 2022 - COMPUTER PROGRAMMING FOR ARTS (ART 22/ MAT 299)
Dynamic Theater
This Augmented Reality immersive dance/theater experience blends real footage of dancers with virtually created avatars performed by UCSB BFA dance students. Various methods are explored in the augmented environment to test ways to deliver an Augmented Reality theater experience. As the story unfolds in the open world, participants are led to designated spots where content is prepared for the designated physical layout. The user experience is closely tied to the geographic location where the digital content is staged. A few challenges emerge from developing and projecting digital content onto the physical world. The class will cover the concepts of Digital Twin, Spatial Awareness, World Locking, and much more. In this project-based class, you will learn to develop an AR application for a wide area, designing an augmented space where a linear story presented in an open world allows users to take ownership of the theater play. The project is planned to be published and released as an open-source application as part of an effort to introduce new ways to experience geo-tagged locations. Be part of this first-of-its-kind theater experience!
"What I hear, I forget.
What I see, I remember.
What I do, I understand." - Xunzi (340 - 245 BC)
"What I hear, I forget.
What I see, I remember.
What I do, I understand." - Xunzi (340 - 245 BC)
Instructor: You-Jin Kim (yujnkm@ucsb.edu)
TAs: Pau Roselló Diaz (paurosellodiaz@ucsb.edu), Kaleb Guo (jinjieguo@ucsb.edu)
Lecture Time: Monday & Wednesday 2:00 - 4:50 p.m.
Office Hours: TBD (by appointment only) - Utilize the hour before and after the class lecture at the eStudio (Arts 2220), as the instructor will be present at those times.
Course Description
Presenting theater in a wide area, where the user is encouraged to move from one point to another, can open up new ways to experience plays. But this method of presentation requires experimentation and close examination, as it combines an open-world experience (nonlinear narrative structure and storytelling) with a tightly controlled theater experience (linear story and chronological structure).
Dynamic Theater is an experimental platform that explores a stage guidance system as part of the play. In Dynamic Theater, dancers (3D-captured avatar recordings) and computational objects (FBX animations) perform together, pioneering how 3D-reconstructed performers can be projected as digital content into the physical world through a head-mounted device (HMD). The platform presents an augmented reality performance delivered through digital content catered to a designated physical location. The project further explores hybrid methods for creating a wide-area augmented reality experience for theater and dance that caters to specific landmarks or architectural structures. Current state of the project: the platform students will be given for the assignments and final project (a GitHub repository) is based on my recent project, Kirby Crossing Ambulatory Cognitive AR Walking, published as an IEEE TVCG journal paper (ISMAR 2022):
YJ. Kim, R. Kumaran, E. Sayyad, A. Milner, T. Bullock, B. Giesbrecht, T. Höllerer: Investigating Search Among Physical and Virtual Objects Under Different Lighting Conditions. In: IEEE Transactions on Visualization and Computer Graphics, IEEE ISMAR 2022
This course focuses on developing and designing 3D art compositions. The course will be operated as a studio class involving 3D art making, implementing game logic, composing music, and choreographing theater productions in the Unity game engine. While the class mainly focuses on art making, we will cover ethical questions arising from the commercialization of AR and VR devices through seminar-style discussion. Course assignments will provide students with the opportunity to refine and develop design principles and evaluation methodologies for the wide-area AR experience.
Course Format
The course will be taught in person as a studio-style class, revolving around assignments that will eventually come together as an AR/VR application for storytelling. The class material will provide students with the opportunity to refine and craft the user experience in the virtual environment. The first few weeks of the class will require your undivided attention as you are introduced to the Unity game engine platform. Each student will go through the Unity Learn course on their own, learning the basics so that they can apply creative methods to their 3D environments, fully embracing creativity. While the class covers technical material, it is not an engineering class; we explore questions revolving around art and visual composition. In some weeks (excluding weeks when assignments are due), there will be activities to be completed outside of class that align with the concepts presented in class.
Students will be introduced to a variety of activities that align with the course objectives, such as (but not limited to):
Visiting the UCSB Kirby Crossing location to examine and design the virtual environment layer for the physical layout.
Participating in seminar discussions based on the weekly readings, including conversations about future predictions and the adoption of Virtual Reality and Augmented Reality in our lives.
As this is a project-based course, students must communicate with their team members to form the structure of a game development team and bring their specialty and creativity to the project.
All the listed activities are considered part of the class and require active participation from each student. This 4-unit course uses a project-based approach and requires students to follow along with the Unity Learn schedule to familiarize themselves with the Unity platform at the beginning of the course to jump-start the project. Please make sure you can secure the time required by university guidelines. Please note that the third week of October (10/17 & 10/19) will be dedicated to the team project, as I will be away at a conference in Singapore. Please read through the class schedule, found on this website, carefully.
Monday Session - [ Lecture, Project Critique, Weekly Project Due ]
Wednesday Session - [ Lecture, Unity Learn, Group Project ]
*For occasional Zoom sessions/meetings, please make sure to turn on your video to show participation.
Dynamic Theater
Recorded dancer footage will be played back as holograms projected into the user's surroundings; no dancer will be physically performing at Kirby Crossing. The only live participant is the user wearing the HMD. Everything is digitally recorded and presented, which eliminates many safety concerns. Dynamic Theater will include a location tutorial and safety-guideline training within the application. We learned extensively how to conduct these outdoor AR studies safely from the previous AR campus study, the Kirby Crossing Ambulatory Cognitive AR Walking project.
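As a rough illustration of how such playback can be wired up in Unity (a sketch, not the actual Dynamic Theater code; the "Player" tag, the Animator reference, and the state name "DancePerformance" are assumptions), a content zone can start a pre-recorded dancer clip when the HMD user walks into it:

```csharp
using UnityEngine;

// Hypothetical sketch: plays a pre-recorded dancer animation when the
// HMD user (tagged "Player") walks into this content zone's trigger.
[RequireComponent(typeof(Collider))]
public class DancerContentZone : MonoBehaviour
{
    [SerializeField] private Animator dancerAnimator;                   // Animator on the captured avatar
    [SerializeField] private string clipStateName = "DancePerformance"; // assumed animator state name

    private void Reset()
    {
        // The zone's collider must be a trigger so the user can walk through it.
        GetComponent<Collider>().isTrigger = true;
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Restart the recorded performance from the beginning.
            dancerAnimator.Play(clipStateName, 0, 0f);
        }
    }
}
```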
Since HoloLens is optimized for indoor environments [24], persistent tracking and spatial mapping in a large outdoor area are challenging. We used Microsoft's "World Locking Tools" (WLT). Since we could not use the HoloLens's spatial mapping to handle occlusions effectively in our large outdoor setup, we modeled a virtual replica of our physical space to handle occlusion. We used 6 WLT Space Pins to align our virtual replica of the space with the physical space. This enabled us to design the experiment entirely on the desktop. We used a structure-from-motion (SfM) 3D reconstruction of our experiment environment to model the virtual replica of the area and correctly place the virtual objects in relation to the physical space. The goal is to present a controlled and carefully curated user experience in a wide outdoor area through mixed reality. The play demonstrates an immersive theater experience catered just for you and prepared in the physical location.
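The project itself relies on WLT Space Pins; the sketch below is a simplified stand-in (plain Unity transforms, not the WLT API) showing the idea behind a single pin: rotate and translate the replica root so that one reference point in the digital twin lands on its measured physical counterpart. The field names here are hypothetical, and the actual tooling additionally blends several such pins across the area and corrects for tracking drift.

```csharp
using UnityEngine;

// Simplified stand-in for what a single Space Pin accomplishes: align the
// virtual replica so that one of its reference points (virtualPin) lands on
// the corresponding measured physical point (physicalPin).
public class ReplicaAligner : MonoBehaviour
{
    [SerializeField] private Transform replicaRoot;  // root of the digital-twin model
    [SerializeField] private Transform virtualPin;   // reference point inside the replica
    [SerializeField] private Transform physicalPin;  // same point measured in the head-tracked frame

    public void Align()
    {
        // Rotate the replica about its root so the virtual pin's orientation
        // matches the physical pin's orientation.
        Quaternion deltaRotation = physicalPin.rotation * Quaternion.Inverse(virtualPin.rotation);
        replicaRoot.rotation = deltaRotation * replicaRoot.rotation;

        // Translate so the (now rotated) virtual pin sits exactly on the physical pin.
        Vector3 offset = physicalPin.position - virtualPin.position;
        replicaRoot.position += offset;
    }
}
```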
Class Participation and Attendance
Project-based learning requires class participation from every student. To ensure that participation is widespread, and that all students have the opportunity to participate, students may be randomly selected to contribute to class discussion. This format will provide you with the opportunity to defend your ideas and to learn from contrasting points of view, skills that will be invaluable in your forthcoming careers. When evaluating your contribution to the discussion, factors such as the following are considered:
Timeliness: Is the comment timely, given the discussion that is taking place?
Accuracy: Does the comment accurately reflect case facts that have not already been stated?
Advancement: Does the comment advance the discussion?
Creativity: Does the comment yield a new perspective and add to understanding of the situation?
Constructiveness: Does the comment help maintain a constructive atmosphere?
A portion of the class will be devoted to small group discussion of readings and concepts from the lecture, and group critique of peer work and presentations. Students are responsible for actively and thoughtfully contributing to these discussions and critiques. Students are also responsible for providing feedback on reading reflections and course assignments by other students.
Quality of participation is more important than quantity. It is possible for someone to talk a lot and receive a low grade for class participation. It is also important that you actively participate in your team discussions. To discourage "gunning," class participation will be graded on a diminishing-marginal-return basis. Attendance is expected. A student who misses three classes will receive a considerably lower grade for this class. If you miss class, the quality of the class discussion suffers, and this can significantly detract from the class experience for other students. Similarly, if other students miss class, your experience suffers. If you must miss class, be sure to clear the absence with me before the class. If you miss a class due to flu/COVID, I ask that you share confirmation of your positive test from the UCSB campus testing site (UCSB Health Portal -> Communication -> request doctor's note). If you need guidance on how this is done, I can gladly share information about how to set up an appointment for your COVID test on campus.
Kirby Crossing
For accurate digital content placement in Mixed Reality, having an accurate digital model of the physical location is crucial. I made a Digital Twin of Kirby Crossing, where Dynamic Theater will take place; it acts as a stage for the user to experience and walk around.
What is a Digital Twin?
The Kirby Crossing Digital Twin was captured with the Matterport 3D camera system and the newer Apple mobile LiDAR scanning platform, and finalized using Blender 3D and the Unity game engine. The VR environment is completely modeled in 3D and includes an occlusion layer, which is used to handle occlusions on the HoloLens device. Lastly, the model is used to imagine and develop dance choreography and stage design.
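As a rough sketch of how such an occlusion layer might be set up in Unity (not the project's exact implementation; the occluder material and its depth-only shader are assumed to exist in the project), a script can walk the digital-twin hierarchy and swap every renderer's material for a depth-only occluder so that holograms behind real structures are hidden:

```csharp
using UnityEngine;

// Hypothetical sketch: turn the digital-twin geometry into an invisible
// occluder. "occluderMaterial" is assumed to use a depth-only shader
// (writes to the depth buffer, draws no color), so holograms behind real
// structures are hidden without the twin itself being visible.
public class OcclusionLayerSetup : MonoBehaviour
{
    [SerializeField] private Material occluderMaterial;

    private void Awake()
    {
        foreach (Renderer childRenderer in GetComponentsInChildren<Renderer>())
        {
            childRenderer.sharedMaterial = occluderMaterial;       // depth-only material
            childRenderer.shadowCastingMode =
                UnityEngine.Rendering.ShadowCastingMode.Off;       // occluders should not cast shadows
        }
    }
}
```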
34°24'53.0"N 119°50'26.2"W
Unity package file size: 258.962 KB
UCSB buildings nearby: Elings Hall, California NanoSystems Institute (CNSI), Kavli Institute for Theoretical Physics (KITP), Materials Department
This top-down map shows the extent of participants' walkable areas and where digital content can be potentially placed.
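Content in the package is authored in the digital twin's local coordinates, but the geo coordinates above can be related to local meters when placing geo-tagged content. Below is a hedged helper (not part of the course repository; the class and constant names are illustrative) that uses the equirectangular approximation, which is accurate over the few hundred meters of the walkable area:

```csharp
using UnityEngine;

// Hypothetical helper: convert a geo-tagged point into local meters relative
// to the Kirby Crossing origin (34°24'53.0"N 119°50'26.2"W), using the
// equirectangular approximation.
public static class GeoToLocal
{
    private const double OriginLatDeg = 34.414722;   // 34°24'53.0"N
    private const double OriginLonDeg = -119.840611; // 119°50'26.2"W
    private const double EarthRadiusM = 6371000.0;

    public static Vector2 ToEastNorthMeters(double latDeg, double lonDeg)
    {
        double originLatRad = OriginLatDeg * Mathf.Deg2Rad;
        double east = (lonDeg - OriginLonDeg) * Mathf.Deg2Rad * EarthRadiusM * System.Math.Cos(originLatRad);
        double north = (latDeg - OriginLatDeg) * Mathf.Deg2Rad * EarthRadiusM;
        return new Vector2((float)east, (float)north);
    }
}
```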
Kim, Y., & Kumaran, R. (2021). UCSB_3D Campus (Version 1.0.1) [Computer software]. available at: https://github.com/yujnkm/UCSB_3D (accessed 19 May 2022). https://doi.org/10.5281/zenodo.6565136
Kim, Y., & Kasowski, J. (2023). VirtualUCSB, Santa Barbara Campus (Version 1.0.5) [Computer software]. available at: https://github.com/yujnkm/VirtualUCSB (accessed 26 September 2023). https://doi.org/10.5281/zenodo.6565136
Course Goals
Explore models of human creative production and translate human interaction behavior within the computational environment.
Identify human experiences that AR and VR technology struggles to emulate and that you would like to simulate in your AR application.
Investigate and implement interactivity that can translate human interaction behavior within computational media.
Explore the sensory components involved in experiencing art/theater and simulate them within the Mixed Reality domain.
Use the GitHub repository of the open-source AR theater platform Dynamic Theater to create your own project by forking it and deviating from it.
Related Works
Dynamic Theater: Location-Based Immersive Dance Theater, Investigating User Guidance and Experience
Dynamic Theater explores the use of augmented reality (AR) in immersive theater as a platform for digital dance performances. The project presents a locomotion-based experience that allows for full spatial exploration. A large indoor AR theater space was designed to allow users to freely explore the augmented environment. The curated wide-area experience employs various guidance mechanisms to direct users to the main content zones. Results from our 20-person user study show how users experience the performance piece while using a guidance system. The importance of stage layout, guidance system, and dancer placement in immersive theater experiences is highlighted, as they cater to user preferences while enhancing the overall reception of digital content in wide-area AR. Observations after working with dancers and choreographers, as well as their experience and feedback, are also discussed.
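As a minimal, hedged sketch of one possible guidance mechanism (not the implementation evaluated in the paper; all field names are illustrative), a world-space arrow can hover in front of the user and point toward the next content zone:

```csharp
using UnityEngine;

// Minimal guidance-indicator sketch: an arrow hovering in front of the user
// that rotates to point at the next content zone. The user (HMD camera) and
// content zone transforms are assumed to be assigned in the Inspector.
public class GuidanceArrow : MonoBehaviour
{
    [SerializeField] private Transform user;           // main camera / HMD
    [SerializeField] private Transform[] contentZones; // ordered content zones
    [SerializeField] private float arrivalRadius = 2f; // meters

    private int currentZone = 0;

    private void Update()
    {
        if (currentZone >= contentZones.Length) return;

        Transform target = contentZones[currentZone];

        // Advance to the next zone once the user is close enough.
        if (Vector3.Distance(user.position, target.position) < arrivalRadius)
        {
            currentZone++;
            return;
        }

        // Keep the arrow hovering in front of the user and aim it at the target.
        transform.position = user.position + user.forward * 1.5f;
        transform.rotation = Quaternion.LookRotation(target.position - transform.position);
    }
}
```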
YJ. Kim, J. Lu, T. Höllerer: Dynamic Theater: Location-Based Immersive Dance Theater, Investigating User Guidance and Experience. In: ACM Symposium on Virtual Reality Software and Technology, Christchurch, New Zealand, October 2023, ACM VRST 2023
Investigating Search Among Physical and Virtual Objects Under Different Lighting Conditions
Augmented reality (AR) via the Integrated Visual Augmentation System (IVAS) has the ability to optimize training and operations by presenting the user with accurate, real-time information. The challenge is that the human cognitive system is limited in processing capacity and any additional information provided in AR has the potential to compromise situational awareness, increase task difficulty for the user, and reduce performance, rather than augmenting performance.
This project has three main objectives: 1) identify the cognitive impacts of ambulatory AR on task performance, 2) identify metrics that predict changes in cognition during ambulatory AR, and 3) identify strategies that will improve cognition during ambulatory AR. To achieve these objectives, we will fuse data acquired from multiple sensors that can inform the AR system about the user's cognitive state, with an emphasis on tracking attention, situational awareness, and performance while engaged in a real-world, assigned user task. The selection of sensors is based on evidence in the literature, as well as our previous work. These sensors include those that may be fieldable in near-term training and operational contexts, including eye tracking and accelerometers for gait and head pose.
The results of this project will provide key insights into the impact of ambulatory AR on attention and situational awareness in particular, and how potential negative impacts can be mitigated. In addition, the inclusion of ambient lighting conditions and virtual/physical objects will set important boundary conditions for deploying AR in realistic outdoor, wide-area usage.
YJ. Kim, R. Kumaran, E. Sayyad, A. Milner, T. Bullock, B. Giesbrecht, T. Höllerer: Investigating Search Among Physical and Virtual Objects Under Different Lighting Conditions. In: IEEE Transactions on Visualization and Computer Graphics, Singapore, October 2022, IEEE TVCG 2022
Exploring Immersive Mixed Reality Simulations and Its Impact on Climate Change Awareness
Traditional communication mediums, such as videos, posters, and newspapers, contribute to challenges in understanding climate change consequences and have yet to effectively communicate the seemingly-abstract concept. To combat widespread climate change misconceptions that halt global mitigation efforts, investigating effective mediums to raise awareness is necessary. We propose an immersive simulation through Mixed Reality to help users contextualize the consequences of climate change through immersion and interactivity. With a LiDAR-scanned physical environment, we constructed a proof-of-concept 3D visualization system that conveyed the detrimental effects of climate change. Our system simulates a personally-relevant experience that allows users to navigate through climate-related disasters, including a spreading wildfire and a rising flood with tornadoes. To evaluate our methods, we conducted a pilot study and collected numeric and qualitative data, including motion sickness symptoms and environment-related statistics. Participants of our study provided feedback indicating the system’s immersive success; users also responded with heightened motivation to adopt more sustainable practices in the future. Results demonstrate the prominent applicability of Mixed Reality visualization systems in raising climate change awareness and aiding climate change contextualization.
Z. Wang, YJ. Kim: Exploring Immersive Mixed Reality Simulations and Its Impact on Climate Change Awareness. In: Asian Journal of Applied Science and Engineering, Volume 12, ISSN: 2307-9584, September 2022, AJASE 2022
Copyright Information and Project Release
My lectures and course materials, including slide presentations and other course materials, are protected by copyright law and by university policy. I am the exclusive owner of the copyright in the materials I create. You may take notes and make copies of course materials for your own use as a source of learning. You may also share those materials with another student who is enrolled in this course. You may not reproduce, distribute, or display (post/upload) lecture notes, recordings, or course materials in any other way without my express prior written consent. You also may not allow others to do so. If you do so, you may be subject to student conduct proceedings under the UC Santa Barbara Student Code of Conduct. Every student must submit the Course Statement of Understanding before the end of the first week.
For each assignment, students will develop on top of the cloned repository and must include correct citations and set the repository access to private. All project submissions in this course will proceed through GitHub. Not meeting the above requirements when submitting your work will result in a point deduction. The class materials were prepared over years of my research and development. For students to experience AR development in 10 weeks, I am sharing my scripts, repository, 3D models, original soundtrack, and dancer footage that I directed and own.
In order to use or distribute this research outside of the class, students must receive approval from the instructor (You-Jin Kim). Such activities include, but are not limited to: competitions, publication to journals or online platforms (paper and video presentations), grad school applications, etc. This requirement is explicitly stated in the Course Statement of Understanding. Failure to receive proper permission violates ethical standards in the research community and is subject to legal action.
© You-Jin Kim
Santa Barbara