Design Doc

Title of Project

A Waste of Space

Razor

A satirical VR experience that addresses the carelessness of large corporations and their methods of "removing" space debris, indicating the need for a proper mitigation plan in the near future.

Slogan

Space may be endless, but our pollution shouldn’t be

Vision statement & top level summary of your project idea

A Waste of Space is a satirical experience set in the near future, where large corporations launch rockets into space and operate space stations to clean up the debris left over from these launches. During a routine debris cleanup, the player is confronted with the shocking reality of how these companies supposedly "clean up" all this waste.

Goal of project/Differentiator

The goal of this project is to bring attention to the rising issue of space pollution. It is not a prevalent issue at the moment; however, at the rate debris is being left in space, the accumulation of waste around the planet will have a negative effect on Earth's already declining environmental state. It will also impact space travel, which is approaching the point where far more people will have access to it.

Theme(s) of project

The themes of this project are worldview transformation, behaviour change, space pollution, climate change, global warming, pollution and corporate malpractice.

The project is set in the near future, in the late 21st century (2080).

Visual style of project

The visual style of the project will focus on realism. Inspirations for the style are taken from real-life footage from space, as well as from other realistic virtual space simulators and experiences. We are going for a realistic style and drawing from these inspirations because we want to give the immersant a realistic experience of being in space, and to show that this is a potential reality if we don't change things now. We also want to focus on the beauty of Earth early on, then show an Earth surrounded by junk to evoke a "space junkyard" feeling. Doing this requires high visual fidelity, so we can draw a connection between the images of Earth people have already seen and our future version.

Core desired user experience

a) Desired user experience and how your VR experience ideally transforms immersants

The project is focused on depicting the effects of issues such as climate change, global warming and pollution, demonstrating how they have affected the Earth, as well as how these issues are also spreading into space. From seeing the effects of these environmental issues, ideally the immersant will be moved to help resolve them and feel the need to protect the Earth, as well as the space around it. While we wouldn't expect the immersant to act directly on these feelings, hopefully they will think about these issues on a more day-to-day basis and change some of their behaviours to help reduce them. The experience will also shed some light on the carelessness of large corporations, which are essentially given free rein to do as they please without any regard for the consequences of their actions. If the people who care about space travel also come to care about keeping space clean, that may encourage more regulation of these private companies, which have no incentive to regulate themselves.

b) How is your project taking advantage of the special affordance and opportunity of VR?

The project makes use of the unique affordances of VR by recreating an experience that would normally only be possible by actually going into space. This ties into the reason we are employing a realistic style: we want our experience to be as close to the real thing as possible, so that the participant understands this is a real future possibility. Zero-g locomotion is also a control mechanism unique to VR, and we can reinforce it with 3D audio.

c) Relation to course design challenge

This project relates to the course design challenge by addressing the issues of climate change, global warming and pollution. These issues are used as a stepping stone to introduce the topic of space pollution; while this issue may not be the most prevalent at the moment, it will become a major one in the future if humans, and private companies in particular, don't change some of their environmentally harmful habits.

Introduction

The premise of this project is that the immersant plays the role of a space cleanup crew member, sent to perform their first cleanup of debris at the space station. During this mission, instead of seeing a beautiful Earth as expected, they will see a world covered in smog and debris left over from rocket launches and satellites, introducing some of the effects of space pollution.

The objective of the immersant within the experience is to clean up the debris around their quadrant so that rockets can take off safely.

The core interaction of the experience is the immersant using their hands to control their thrusters and move around the gravity-free virtual environment. They will also use the magnet attached to their glove to interact with objects and the environment.

Narrative / Story

The narrative of the project is that a space debris cleaner, working for a large corporation, is sent to a space station orbiting the Earth to clean up debris left over from previous rocket launches and satellites. When the cleaner arrives on the station, corporate videos explain the background context of the world. The video mentions how some corporations are changing the world by expanding humanity to multiple planets. The video is satirical, poking fun at how these corporations highlight how they're progressing humanity while destroying the worlds we live on. An operator speaks to the cleaner over comms, guiding them through the station and introducing them to the magnet gun. The cleaner is tasked with collecting space debris and storing it on the station to clear a path for upcoming rocket launches. Once all the debris has been cleaned up, the space station ejects the collected debris into another part of space filled with other ejected junk, and the cleaner discovers that the debris is essentially just being relocated. The operator comments on how we've polluted Earth, and now we've also polluted space.

VR mechanics & Physical Rig

Core Mechanics

The Oculus Quest 2 controllers drive the core mechanics of the experience. Users float during the whole experience due to the lack of gravity. Using the controllers, users can operate a space magnet gun to collect space junk and interact with other objects in the environment. For movement, the player has thrusters on their suit that they control with the left controller to get around the space station (Figure 1). They will also be lying down flat to simulate the effect of floating through space (Figure 2). Using the magnet gun and their flight controls, they venture through the space station and collect the debris outside to clean up space. This control scheme provides some simulation of reduced gravity and the experience of travelling through space. The immersant wears the HMD for the whole journey in order to view the environment. The GUI shows the current objective and progress towards it.

Secondary mechanics

The experience supports one user at a time, and the target audience is aged 16 to 50. During the exhibition, we will provide clear instructions, an explanation of the mechanics, and some story setup before the VR experience. During the VR experience, we will spectate and remain silent to let users immerse themselves. Help will be provided in any emergency scenarios. When the whole experience is done, users will be asked to take off the HMD and wrap up with an exit survey. Basic instructions include no jumping, no moving the lower body, and no leaving the physical rig during the experience; users should follow any further instructions from the space traveller assistant (Figure 3).

Control mechanism: HMD and controllers.

https://www.highsnobiety.com/p/oculus-quest-2-buy-online/

Figure 3 - Procedure flowchart

Envisioned physical setup

For the physical setup, we wanted to use an adjustable bench (Figure 4) to let users lie down on their chest. This is to simulate floating and to draw a distinction from walking. However, due to space constraints and control considerations, the player will instead be standing up.

Figure 4 - Marcy Pro Adjustable Home Gym Utility Exercise Weight Training Workout Bench

https://cdn.sweatband.com/adidas_essential_workout_bench_2018_adidas_essential_workout_bench_2018_2000x2000.jpg

Actual final showcase setup: the bench is removed and the player stands.


We also provide a computer lab chair as an extra option if users are not comfortable with the adjustable bench. Our project uses an HMD to provide a 360° view of the environment, which allows users to look and tilt their head around. Since they won't be rotating on the bench, they can use the buttons on the controllers to activate their thrusters and rotate. We plan to use our own laptop.

Locomotion technique

Option 1: Player lies down on a bench to simulate floating, and moves around and rotates in the world using controller inputs. This simulates the real experience of being in a suit with thrusters that propel you in small bursts.

Pros:

  • Greater connection to virtual body

  • More differentiation from normal body movements

Cons:

  • Could be more difficult to control

  • Might be more prone to sickness


Option 2: Player sits down on a chair and uses controllers to move around. This avoids some of the potential sickness and lessens its effects.

Pros:

  • May avoid potential sickness

  • Easier to control

Cons:

  • Less immersive

  • Cheapens the effect of floating in space


Decision:

Option 1. We want the player to feel like they're floating in a space suit, so this will be our main approach. We also have the option of using Option 2 if the participant isn't comfortable with lying down or gets motion sick easily.

Inspiration Analysis

This project was first inspired by the spacewalks astronauts go on, and by how seeing the beauty of the Earth causes astronauts to change their worldview.

NASA Astronauts Describe the "Overview Effect" in Their Own Words

After doing some research, we discovered that some virtual experiences have already been created to simulate the Overview Effect. We still wanted to create a similar kind of experience, so we decided to relate the idea of the Overview Effect to other real-world issues. This led to the idea of an Overview Effect in reverse, where we see not the beauty but the ugliness of Earth as a result of human actions. The topic of space pollution came up, and we thought of using the real-world problems of climate change, global warming and pollution as a gateway to introduce this issue that may become prevalent in the future.

Space Debris and Human Spacecraft

For the technical aspects of this project, we drew inspiration from some of the other virtual space experiences we found, such as using hands to pull oneself along as a navigation method in a zero-gravity environment. We also took some inspiration for the visual style of our project from these experiences, as well as from real-life space footage.

Immersion Frameworks

(a) In what way will your project support immersion, flow and/or presence etc.?

Our project has strong elements of Imaginative Immersion, as it guides the player through a narrative. This narrative is centred on an evil company, with the player as its worker and the operator as their guide. The story has the player become invested in their role as cleanup crew until the ending, where they see the results of their actions. By putting the player in this role, we give them a sense of agency in the story by having them clean up the debris, then take that agency away at the end to emphasize the message of corporations doing the bare minimum.

We also have strong Sensory Immersion by having the player feel like they're floating in space in a space suit, controlling the suit through their handheld controllers. They will also hear a lot of spatial audio, which gives them a sense of presence in the world. Objects will also behave the way the player would expect in a zero-gravity environment.

There is also minor Challenge-based Immersion, as the player must use their suit and magnet gun to collect the debris, which takes up a small part of the experience.

We create flow by introducing the player in a closed, limited environment where they can learn the controls, and then expanding out into space, where they use what they learned to complete the objective.

(b) What type of immersion are you focusing most on, and why? How do you plan on using these to support your overall project objectives and desired user experience?

Our project focuses primarily on Imaginative and Sensory Immersion, because they best communicate the feeling of playing a role in an outer-space environment. We want the player to move through space as an astronaut in a zero-gravity environment and have the 3D audio strengthen the sense of presence in that place.

(c) Please explain in detail how your team plans on evoking your chosen immersion aspects.

We plan to create strong Sensory Immersion by first having the player lie down to evoke the floating effect. Then, using spatial audio cues and thruster sounds, we will give them a sense of being in a spacesuit. The immersant gains Imaginative Immersion by entering the VR experience and learning about the careless company through environmental storytelling. We plan to place over-the-top brand videos about the company around the space station, and the debris will be marked with the company's logo to indicate that it is the source. As a result, the player discovers a link between the junk out in space and the company's desire to expand.

Why Your Project is Innovative

(a) What’s new/interesting/cool/exciting/different about your project?

Our project is innovative as it addresses space pollution, a topic that isn't often brought up and that not everyone is even aware of. Also, instead of focusing on an issue that is prevalent at the moment, it is more about generating awareness to prevent a problem that will occur in the near future.

(b) Why is your project relevant? How does it provide a meaningful / desirable experience to the users?

Our project is relevant because global warming, climate change and pollution are currently major problems in the world. Space pollution is related to these issues, and if the way we've handled these other environmental issues is any indicator of how we'll deal with space debris, the problem will become very extensive in the future; there are already real issues with space debris today.

(c) For your showcase, what would be your main “selling points”? Why should anyone care about it?

The main interest points of our project are that it takes place in a futuristic setting in space and that it covers a topic not many people are familiar with, providing some insight into what this issue may look like in the future. Participants should care about this project, as it brings awareness to the problem of space pollution that will grow if preventative actions are not taken.

User Testing Goals and Outcomes

Test 1

(a) Goals, Questions, and Hypotheses:

Goal: Determine the effectiveness of the narrative to see if it evokes the desired impact.

Questions:

  • Do the participants become emotionally invested in the story?

  • At which point(s) in the story do the participants have the strongest emotional reaction?

  • Does the overall narrative change the perspective of the participant towards the issue of space pollution?

  • Does the overall narrative change the perspective of the participant towards large corporate companies and their business practices?

  • Does the overall narrative change the perspective of the participant towards the issues of global warming, climate change or pollution on Earth?

Hypothesis: The narrative is effective in shifting the worldview perspective of participants to care more about space pollution and be more wary of the business practices of large corporate companies.


(b) Methods:

  • We gave the participant the script and asked them to read it

  • Asked the participant questions after the experience

    • On a scale of 1 - 10 how would they rate the story?

    • What were their initial thoughts about the experience?

    • Did they feel like there was a message in the story?

    • How did they feel about the company in the story?

    • How much do they know about space pollution, and did they learn something new from reading this?

  • Organized the observations based on their ratings of the experience, and how closely their understanding of the story, message and company matched the descriptions we intended.


(c) Results:

  • Users found the story interesting for the most part. They liked the use of satire and suggested we could make it even more satirical than it is. Overall, most players understood the message of space pollution being caused by private companies.

(d) Meta-reflection:

  • We knew the story could be interesting and wanted to go for a satirical tone, but we might have been too safe in that regard, as users wanted it to be even more satirical. As a result, we can modify what the characters say in the script to heighten the satire.


Test 2

(a) Goals, Questions, and Hypotheses:

Goal: Determine if the control mechanism gets people motion sick.

Questions:

  • Do the users get motion sickness easily?

  • If so, does the method of locomotion contribute to causing this sickness?

  • What kind of movement contributes the most to causing this sickness?

  • Was the speed of the movements too fast, too slow, or just right?

Hypothesis: Our form of locomotion does not induce motion sickness.

(b) Methods:

  • We had the participant move around in our prototype virtual world, using our proposed method of movement.

  • Asked the participant questions after the experience

    • On a scale of 1 - 10, how much motion sickness did they feel?

    • What were their initial thoughts about the movement?

    • What aspect of the movement contributed the most to motion sickness?

    • Does the movement feel natural?

    • If you got sick, how far into the experience did you start to feel the effects of motion sickness?

  • Organized the observations based on their ratings of the experience, and the types of wording they were using to describe the impact in the second pair of questions.

(c) Results:

There were little to no reports of participants feeling sick, with a few feeling minor effects during our test. We received varied responses about the speed of the movements and how natural the movements felt.

(d) Meta-reflection:

Since some users wanted the experience to be faster, we will consider adding some buttons that can adjust the speed of their thrusters or the power of the magnet gun.


Test 3

(a) Goals, Questions, and Hypotheses:

Goal: Determine if not having a floor disorients people.

Questions:

  • Is the feeling of not having a floor too disorienting?

  • Do we need a ground at all times?

  • Will the floating out in space section be any more disorienting than floating inside the closed space station?

Hypothesis: Not having a floor doesn’t disorient players.

(b) Methods:

  • Observe the player as they experience the floating in space prototype

  • Give participants a survey

    • On a scale of 1 - 10 how dizzy did they get just standing in space?

    • On a scale of 1 - 10 how dizzy did they get moving in space?

    • Did not having a floor impact their experience in any way?

    • Did they feel like they wanted to take the HMD off at any point?

  • Organized the results based on the scores and the types of wording they were using to describe the impact in the second pair of questions.

(c) Results:

  • The lack of floor and feeling of floating in space didn’t seem to induce motion sickness in the majority of the participants. The only minor thing we noticed from the interview feedback was that being in space all of a sudden at the beginning can be a bit scary or abrupt.

(d) Meta-reflection:

  • Based on these results, we will continue forward with having both a section in the space station, as well as one out in space where the user may freely move around. We also start our actual experience inside the space station so the player isn't abruptly sent out into space upon putting on the HMD.


Test 4

(a) Goals, Questions, and Hypotheses:

Goal: Determine if the controls are intuitive and easy to understand for the user.

Questions:

  • Are actions easy for the user to perform with the proposed control scheme?

  • Are the controls intuitive enough that the user can easily remember how to perform specific actions?

  • Does using the proposed controls feel natural for performing their relative actions?

Hypothesis: The controls are intuitive for performing their corresponding actions.

(b) Methods:

  • We had the user perform all the different actions in our virtual prototype.

  • Give participants a survey

    • On a scale of 1 - 10 how easy was it to perform the different actions?

    • On a scale of 1 - 10 how easy was it to remember how to perform the different actions?

    • Did any actions feel awkward to perform?

    • Did they feel like they wanted to take the HMD off at any point?

  • Organized the results based on the scores and the types of wording they were using to describe the controls.

(c) Results:

  • Some of the actions felt a little awkward and unnatural to perform, such as constantly holding the grip to hold the gun, as well as pressing the A button to eject the debris from the gun.

(d) Meta-reflection:

  • We are thinking of alternatives to holding the grip to hold the gun, such as attaching the gun to the hand after grabbing it so that the grip doesn't need to be held down, or making a magnet glove instead of a gun so that an object doesn't need to be picked up. We are also thinking of mapping the debris eject to the trigger in addition to the A button.


Test 5

(a) Goals, Questions, and Hypotheses:

Goal: Determine how much the audio and visual elements contribute to the experience.

Questions:

  • Do the visuals look realistic?

  • Does the audio sound realistic?

  • Does the combination of audio and visuals create a realistic atmosphere?

Hypothesis: The audio and visual elements create a realistic sense of being in space.

(b) Methods:

  • We had the user play through our prototype.

  • Give participants a survey

    • On a scale of 1 - 10 how realistic did the visuals feel?

    • On a scale of 1 - 10 how much did it feel like being in space?

    • Did the soundscape contribute to feeling like you were actually there in the virtual environment?

    • On a scale of 1 - 10 how easy was it to understand the audio sources?

  • Organized the results based on the scores and the types of wording they were using to describe the audio and visuals.

(c) Results:

The visuals felt fairly realistic and gave a sense of being in a space station. The audio of the operator speaking was too quiet and so the story and instructions weren’t entirely clear.

(d) Meta-reflection:

We are planning on redoing the operator audio to make sure that it is loud enough without peaking, that it is of better quality, and that it includes more instructions for the controls in the experience.


Test 6

(a) Goals, Questions, and Hypotheses:

Goal: Determine the effectiveness of the narrative.

Questions:

  • How effective is the story?

  • Does each part of the story make sense?

  • Was any part of the story confusing?

Hypothesis: The narrative makes sense and generates intrigue.

(b) Methods:

  • We had the user play through our prototype.

  • Give participants a survey

    • On a scale of 1 - 10 how interesting is the story?

    • On a scale of 1 - 10 did the story make sense?

    • Were there any parts of the story that were confusing?

  • Organized the results based on the scores and the types of wording they were using to describe the story.

(c) Results:

Certain parts of the story were unclear because the audio was hard to hear; however, the parts that could be heard made sense.

(d) Meta-reflection:

We are planning on redoing the operator audio to make sure that it is loud enough and can be easily understood. We are also going to decrease the volume of all other audio to ensure that it never drowns out the operator audio.


Test 7

(a) Goals, Questions, and Hypotheses:

Goal: Determine how the movement feels to control.

Questions:

  • How does the speed of the movement feel?

  • Does the movement feel natural?

  • Does the movement induce any motion sickness?

Hypothesis: The movement feels natural and gives a sense of floating and using thrusters to move through space.

(b) Methods:

  • We had the user move around the environment within our prototype.

  • Give participants a survey

    • On a scale of 1 - 10 how fast were the movements?

    • Did the movement feel natural and intuitive?

    • Did they feel like they wanted to take the HMD off at any point?

  • Organized the results based on the scores and the types of wording they were using to describe the movement.

(c) Results:

The movement doesn’t feel too fast and feels natural to control.

(d) Meta-reflection:

Even though the movement currently works fairly well, we still want it to feel more floaty as if the user is in space, so we are modifying it slightly but keeping it relatively the same.


Test 8

(a) Goals, Questions, and Hypotheses:

Goal: Determine the clarity and ease of understanding the instructions.

Questions:

  • Are the instructions clear?

  • Is there any point where the user doesn’t know what to do?

  • Is there any point where the user doesn’t know where to go?

Hypothesis: The instructions are clear at telling the user what to do and where to go.

(b) Methods:

  • We had the user play through the prototype.

  • Give participants a survey

    • Were the instructions given in the prototype clear?

    • Was there any point where the user was unsure where to go next?

    • Was there any point where the user was unsure what they were supposed to be doing next?

  • Organized the results based on the scores and the types of wording they were using to describe the instructions.

(c) Results:

It isn’t always clear where to go next and some users were unsure of what their current task was.

(d) Meta-reflection:

We are going to add visual cues in the form of waypoint markers to help guide the user to their next location. We are also going to redo the operator audio to better describe what the user should be doing next, as well as how to do it. If a user is taking too long to perform their next task, we are going to repeat the instructions to give them a reminder of what they are supposed to be doing.

Prototyping Process

We started with the storyboard of our initial concept. To determine the effectiveness of our narrative, we decided to make our first prototype a paper prototype of the narrative. We wanted to test whether the story is impactful and able to deliver the desired message, so we wrote a script first. We used the script to enact the story for user-testing participants and see what feelings it evoked for them afterwards.

For our first Unity prototype, we focused mainly on implementing the core mechanics and interactions that would be present in our experience. This included movement, as well as the magnet gun interactions. We had a space skybox with the Earth on it, but all the models were just placeholders consisting of simple spheres and cubes.

For the second Unity prototype, we built out the space station and set up all the narrative cues within Unity. For the most part, this prototype had the entire experience completed, aside from some audio sources and 3D models that needed to be added or replaced.

Development Process

Update 2

(a) Summary

We began prototyping ideas by having each team member contribute an idea that they would like the VR experience to be about. We cut these down to two separate topics and brainstormed ideas for each one using sticky notes, as seen in Appendix A. After we decided on the space travel idea, we searched online for videos of VR experiences set in space for inspiration, and found material on the Overview Effect, as well as information and facts about space pollution (refer to Section 13 and Appendix B). Over the next few weeks, we began to brainstorm ideas for the physical rig, and also started to focus on space pollution as the main narrative for the project. With a more specific goal in mind, we were able to begin brainstorming more details about the story we wanted for the experience. We also finished the storyboard and iterations of the title and slogan, which remain open for revision as the narrative becomes more developed and solidified.

(b) Discussion and reflection on your team process

Our team doesn't rely heavily on the scrum board, because we typically work on tasks together during our meetings and do touch-ups on our own. Instead, we use the scrum board to track which general tasks we need to complete and their status. We updated it as a team at the start and end of our meetings so that we could see what work was done and what was still left.


Update 3

(a) Summary

This update we focused on user testing and finalizing the concepts. We also wrote out our script to figure out the tone and dialogue. For testing we used both the script and our Unity prototype to test different hypotheses. Based on the results and feedback from our own user testing and from the workshop, we also started to make improvements and changes to our project, mainly in regards to the controls and narrative.

(b) Discussion and reflection on your team process

We have been using our task tracker every meeting to organize assignments, which allows us to visually prioritize what needs to be done next. Building the Unity prototype took longer than expected, but we managed to complete the main parts we wanted to test.

Update 4

(a) Summary

For this update, we finalized everything that was needed for the final showcase. We put together a project poster and an executive summary and printed them out at Staples. The Unity prototype was mostly finished; however, small adjustments were made to make sure everything works as it should. Lastly, we put together a project pitch video using some stock footage and a screen recording of the Unity prototype.

(b) Discussion and reflection on your team process

At this point, everyone had their own distinct role in the project, and we each contributed a great amount to the submissions. We were also generally able to submit everything ahead of schedule and didn't have to rush much at all to get things done.


Critique

From the inspiration aggregation discussions, we analyzed our original idea and settled on some changes we wanted to implement.

(a) What feedback did you receive? Please summarize here.

Our idea of recreating the Overview Effect was too similar to experiences that had already been made, and the narrative lacked interaction. After presenting our pitch, we received a lot of feedback concerning motion sickness. There was also a critique that the experience makes it hard for the user to care more about space pollution, because it doesn't show how the issue personally affects the individual player, and it is difficult for each individual to make contributions towards solving it.

(b) How specifically will (or did) you incorporate them into your current project? If you did not, please justify.

To address this, we shifted our idea to incorporate more physical tasks and interactions for the player. We also shifted the focus of the project from the Overview Effect to space pollution and private space flight, to be more focused on one specific narrative. To reduce motion sickness, we will let players control the movements with their controller, and avoid any camera rotation not initiated by the player while floating through the environment. Players will also have the option to stand or use a chair during the experience instead of using the bench. One idea we have so far to make players care more about space pollution and its effect on Earth is to show Earth in its true beauty at the beginning of the experience before revealing what it really looks like with all the pollution surrounding it. By seeing the before and after states of the planet, players can begin to see how space pollution affects the world we live in.


Update 2

We received useful feedback from the final showcase; while it has not yet been incorporated into our VR experience, it is useful for determining how we could further improve it.

(a) What feedback did you receive? Please summarize here.

The instructions for our method of movement were not clear enough and had to be explained by someone outside of the VR experience. It was also easy to miss the company brand video within the station, as well as the debris ejection sequence at the end.

(b) How specifically will (or did) you incorporate them into your current project? If you did not, please justify.

To clarify our method of movement, we would include additional operator audio telling the user that they move in the direction they are looking. To make the company brand video more visible, we would increase the size of the trigger that activates it and move it to a different section of the station, somewhere not located in a corner, so that the user is more likely to see it. For the ending eject sequence, we would have it activate from a trigger instead of a timer, so that the user has to make their way over to the cargo container, preventing them from missing the ending.

Equipment needs

  • Bench (provided by us)

  • Chair (a lab chair will be fine if the bench is unavailable, there is not enough space in the lab, or the immersant isn't comfortable with it)

  • Oculus Quest 2 head mounted display, and controllers

  • Personal laptop for Unity

  • Noise Canceling Headphones (from Surrey SFU library)


Technical Documentation

How to run the game:

  • Go to project files folder

  • Download FinalBuild.zip

  • Run the exe (may require an Oculus HMD, as the bindings are set up for that)

Code Architecture:

  • Narrative Architecture

    • Most of the sequencing is handled by NarrativeScriptingController.cs, which tracks the current story step and triggers whatever needs to happen when certain steps are reached. Some story steps advance based on time, which is tracked in that step's data, while others advance based on external triggers that must be met (calls to its public functions). A minimal sketch of this kind of step-based sequencing appears after this component list.


  • Player movement

    • Modified the Player prefab that comes with SteamVR to include our PlayerController.cs script. This script sets up the keybindings with the SteamVR input actions and applies motion to the player based on the values of those bindings (see the movement sketch after this list).


  • Magnet Gun

    • Created a MagnetGunController.cs script and attached it to our gun object. This handles pulling objects towards the gun while the Attract button is pressed, attaching an object to the gun once it has been pulled close enough, and ejecting that object when the eject button is pressed (see the magnet gun sketch after this list).

  • Airlock Controller

    • Collider controller for the airlocks which triggers an event in the narrative controller.

  • AnimationAudioPlayer

    • Component to attach to objects with an Animator component, so that their animation clips can call the PlayAnimAudio function at specific points in the animation.

  • BrandVideoController

    • Collider controller for the brand video which plays the video when entering the collider.

  • CollectionColliderController

    • Collider controller for the debris collection area which triggers an event in the narrative controller.

  • Constants

    • Static constants used throughout the project for consistency.

  • DataClasses

    • Custom classes for each step in the story which store the data associated with that step.

  • DebrisObjectScript

    • Data for debris objects, such as the distance at which they attach to the magnet gun

  • GunStationController

    • Collider controller for the gun station which triggers an event in the narrative controller.

  • MagnetAttachPointController

    • Attaches an object being pulled if it enters this collider area.

  • MagnetCollisionController

    • Pulls objects if the pull button is held and they remain inside the collision area.

  • MagnetController

    • Main magnet gun controller which handles Pulling, Attaching and Ejecting of debris objects.

  • MasterScript

    • Sets application frame rate and scene name

  • NarrativeScriptingController

    • The main script for the game. It controls the narrative scripting and triggering of events based on story steps.

  • PlayerController

    • The main player controller. It moves the player based on input from the controller.
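
The following is a minimal sketch of how the step-based sequencing described under Narrative Architecture could be structured. It is illustrative only: the class name, step fields, and event hooks here are assumptions rather than the actual contents of NarrativeScriptingController.cs.

using System;
using UnityEngine;

// Illustrative sketch of step-based narrative sequencing (not the project's actual script).
public class NarrativeSequencerSketch : MonoBehaviour
{
    [Serializable]
    public class StoryStep
    {
        public string stepName;
        public bool advanceOnTimer;                     // true: advance after 'duration' seconds
        public float duration;                          // only used when advanceOnTimer is true
        public UnityEngine.Events.UnityEvent onEnter;   // things to trigger when the step starts
    }

    public StoryStep[] steps;

    private int currentStep = -1;
    private float stepTimer;

    private void Start() { AdvanceStep(); }

    private void Update()
    {
        if (currentStep < 0 || currentStep >= steps.Length) return;

        StoryStep step = steps[currentStep];
        if (step.advanceOnTimer)
        {
            stepTimer += Time.deltaTime;
            if (stepTimer >= step.duration) AdvanceStep();
        }
        // Steps that are not timer-based wait for an external call to NotifyTrigger().
    }

    // Called by external colliders/controllers (e.g. the airlock or debris collection area).
    public void NotifyTrigger()
    {
        if (currentStep >= 0 && currentStep < steps.Length && !steps[currentStep].advanceOnTimer)
            AdvanceStep();
    }

    private void AdvanceStep()
    {
        currentStep++;
        stepTimer = 0f;
        if (currentStep < steps.Length) steps[currentStep].onEnter?.Invoke();
    }
}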

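Below is a minimal sketch of zero-g thruster movement in the spirit of the player movement described above. It is a simplified stand-in: the placeholder Input axes, field names, and tuning values are assumptions, whereas the actual PlayerController.cs reads the SteamVR input actions.

using UnityEngine;

// Illustrative sketch of thruster-based zero-g movement (placeholder input, not SteamVR bindings).
[RequireComponent(typeof(Rigidbody))]
public class ThrusterMovementSketch : MonoBehaviour
{
    public Transform head;                 // HMD transform; thrust is applied in the look direction
    public float thrustAcceleration = 2f;  // metres per second squared
    public float rotationSpeed = 45f;      // degrees per second of controller-driven yaw

    private Rigidbody body;

    private void Awake()
    {
        body = GetComponent<Rigidbody>();
        body.useGravity = false;           // floating in space: no gravity
        body.drag = 0.2f;                  // small damping so the player can drift to a stop
    }

    private void FixedUpdate()
    {
        // Placeholder inputs; the real project maps these to the SteamVR action bindings.
        float thrustInput = Input.GetAxis("Vertical");
        float yawInput = Input.GetAxis("Horizontal");

        // Accelerate in the direction the player is looking.
        body.AddForce(head.forward * thrustInput * thrustAcceleration, ForceMode.Acceleration);

        // Rotate the whole rig around its up axis.
        Quaternion yaw = Quaternion.AngleAxis(yawInput * rotationSpeed * Time.fixedDeltaTime, Vector3.up);
        body.MoveRotation(yaw * body.rotation);
    }
}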

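The next sketch outlines the magnet gun's pull / attach / eject flow described for MagnetGunController.cs. Again this is illustrative: the thresholds, field names, and the way button state is passed in are assumptions, not the project's exact implementation.

using UnityEngine;

// Illustrative sketch of the magnet gun's pull, attach and eject behaviour.
public class MagnetGunSketch : MonoBehaviour
{
    public Transform attachPoint;        // where debris snaps onto the gun
    public float pullForce = 5f;         // acceleration applied while attracting
    public float attachDistance = 0.3f;  // close enough to snap onto the gun
    public float ejectSpeed = 4f;        // launch speed when ejecting

    private Rigidbody heldDebris;

    // Called every physics step for a debris rigidbody inside the magnet's trigger volume.
    public void PullDebris(Rigidbody debris, bool attractHeld)
    {
        if (heldDebris != null || !attractHeld) return;

        Vector3 toGun = attachPoint.position - debris.position;
        if (toGun.magnitude <= attachDistance)
        {
            // Close enough: attach the debris to the gun.
            heldDebris = debris;
            debris.isKinematic = true;
            debris.transform.SetParent(attachPoint, worldPositionStays: true);
        }
        else
        {
            // Otherwise keep pulling it towards the gun.
            debris.AddForce(toGun.normalized * pullForce, ForceMode.Acceleration);
        }
    }

    // Called when the eject button is pressed.
    public void Eject()
    {
        if (heldDebris == null) return;

        heldDebris.transform.SetParent(null);
        heldDebris.isKinematic = false;
        heldDebris.velocity = attachPoint.forward * ejectSpeed;  // push the debris away from the gun
        heldDebris = null;
    }
}
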
Problems and Solutions:

  • We didn't encounter any major technical problems aside from some VR motion oddities, such as hands or other objects getting stuck on the player body and causing the tracking to become shaky. To fix this, we removed the collision between the hands and the player body in the physics simulation. We couldn't do the same for debris objects, since the player needs to be able to collide with them as they float around for realism, but we still needed a way to prevent collision while an object is attached to the gun so that it doesn't get stuck against the player collider. To do this, we temporarily disable collision between the player and a debris object while it is attached (see the collision sketch after this list). While this can cause the debris to clip through the gun model in some edge cases, the tradeoff was worth it to make the interaction feel more natural.

  • One other issue we had was lag caused by highly detailed objects. To solve this, we used some tricks, such as swapping out distant debris objects with the basic Unity capsule object, and curated our selection of debris models to include only the most performant ones (a sketch of this distance-based swap also follows this list).

  • One issue we overcame was needing to apply our custom logo to the textures of the space station and debris. Due to time constraints, we were able to modify the textures for the debris, but there were too many different space station assets to modify in the time available. To solve this, we created a 3D model of the logo and applied it to the finished space station. This way we didn't have to modify every asset in the space station pack.
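
As referenced in the first point above, here is a minimal sketch of how collision between the player and an attached debris object can be toggled off and back on using Unity's physics API. The component and field names are assumptions for illustration.

using UnityEngine;

// Illustrative sketch: temporarily ignore player-vs-debris collision while the debris is held.
public class DebrisCollisionToggleSketch : MonoBehaviour
{
    public Collider playerCollider;  // the player body's collider
    public Collider debrisCollider;  // this debris object's collider

    // Call when the debris attaches to the magnet gun.
    public void OnAttached()
    {
        Physics.IgnoreCollision(debrisCollider, playerCollider, true);
    }

    // Call when the debris is ejected, so it collides with the player again.
    public void OnEjected()
    {
        Physics.IgnoreCollision(debrisCollider, playerCollider, false);
    }
}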

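The distance-based swap mentioned in the second point can be sketched as follows; the swap distance and object references are assumptions rather than the values used in the project.

using UnityEngine;

// Illustrative sketch: swap a detailed debris mesh for a cheap capsule stand-in at a distance.
public class DebrisLodSwapSketch : MonoBehaviour
{
    public Transform player;
    public GameObject detailedModel;   // high-poly debris mesh
    public GameObject capsuleStandIn;  // simple Unity capsule shown when far away
    public float swapDistance = 50f;   // metres

    private void Update()
    {
        bool far = Vector3.Distance(player.position, transform.position) > swapDistance;
        detailedModel.SetActive(!far);  // show the detailed mesh only when the player is close
        capsuleStandIn.SetActive(far);
    }
}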

Project Files: https://drive.google.com/drive/folders/1I4ucvVCQZdRPK3o1r-ZM4lc-MRA1xPc9?usp=sharing

  • The source code is in the A Waste of Space Final Editor zip file, under the assets -> scripts folder.


Appendix A: Documentation of Ideation Process

Ideation of our two initial topics: a project focusing on the Overview Effect, and a project focusing on racism. We decided to go with the space idea because we believe it could result in a more impactful experience about a lesser-known issue. We also decided to focus specifically on space pollution, as it is a newer issue about which we can deliver a clear message while contributing to our overall goal of addressing climate change and pollution.

Appendix B: Sketches & Misc