Video walkthrough of (Don't) Fear the Reaper's environment and interactable assets
(Don't) Fear the Reaper is a project built for VR headsets (in this case, the Oculus Quest 1) that showcases a fear. The scope of the project can be serious or humorous, but either way we must help our user face that fear. Here, it is a fear of nonsensical dreams. Many times our dreams do not make sense, and we may wonder whether our state of mind is something to be concerned about. Our goal is to help the user realize that while there may be many nonsensical things here, it is still just a dream. There is absolutely nothing to worry about.
We start out rather tame: inside of the classroom with our Prof. Andrew Johnson at the front. However, you'll notice... a bean man? Um, let's not look at him for now. The sign next to him says to feed the plants. Perfect, something normal. Go ahead and enter the cubicles and water some plants. Don't forget to grab the blue mug full of water! Be careful: if the plants die, their spirits will haunt you...
To see if you're ready to conquer your fears, click the video to the left and see what's in store. If you're brave enough, follow the installation instructions below!
Download Unity
Select the appropriate Unity Hub download (Mac, Windows, etc.)
Select Unity 2019.4.28f1 from the 2019 tab. Download through Unity Hub
This project is designed for the Oculus Quest. Follow Unity's instructions for building to Android devices
Download VRTK
Follow instructions for download on VRTK's GitHub page
Be sure to install to your Unity program folder rather than (Don't) Fear the Reaper's project folder
Build (Don't) Fear the Reaper
Download through GitHub
Open in Unity. Go to File > Build Settings
Be sure to build for Android, clicking "Switch Platform" if necessary.
Click "Build and Run." The first build may take upwards of ten minutes.
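If "Build and Run" does not pick up the headset, the APK can also be sideloaded manually with adb. The helper below is a minimal sketch: it assumes builds are exported to a `Builds/` folder (the folder name is our assumption, substitute your own export path) and prints the install command for the newest APK it finds.

```shell
# Locate the most recently exported .apk and print the adb command to
# sideload it onto the Quest. "Builds" is an assumed export folder name.
print_install_cmd() {
  build_dir="${1:-Builds}"
  # ls -t sorts newest-first, so the first match is the latest build
  apk=$(ls -t "$build_dir"/*.apk 2>/dev/null | head -n 1)
  if [ -z "$apk" ]; then
    echo "no apk found in $build_dir" >&2
    return 1
  fi
  # -r reinstalls the app while keeping its data on the headset
  echo "adb install -r \"$apk\""
}
```

Run the printed command with the Quest connected over USB and developer mode enabled (you can confirm the connection first with `adb devices`).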
Kenney City models (16)
Kenney Nature models (14)
Bean King & Machete on Wheels
Bean Machine
Bean Man
Token Dispenser
Bean Cat
Water Machine
Morale Sign
Bean
"Feed the Plants" Sign
Fighter Bean
Tokens
Letter A
Rugs x4
Pictured: Desk & chair directly in front of user. Andy and Bean man looking to the right of the user.
Framerate dips at the start of project loading
Pictured: Angered ghost banana tree follows user.
Even when around multiple models, the FPS stays mostly green as the ghost plants follow the user; the transparency effect has only a slight impact.
Pictured: 2nd floor view of the room with red lights for dying plants
In corners, the project loads both the inside and the outside and experiences significant lag.
Pictured: Mug exploding when the water inside runs out.
During an in-game explosion, the framerate dips slightly but again stays fairly consistent and in the green!
Pictured: User feeding Andy beans.
During interactions with a more detailed model, FPS drops slightly
Pictured: Outside scene. Low poly tree with hamburgers underneath
Against the walls of ScareCo, FPS goes down due to partial loading of both inside and outside assets
Refer to "External 3D Model Assets"
Refer to "Internal 3D Model Assets"
1. Mugs
2. Beans
3. Pizza
4. Bags
5. Apples
6. A's
7. Hamburgers
8. Tokens
9. Takeout
10. Cans
11. Rice balls
Bean man ominous breathing
Bean walking in corner
Bean king guarding his bean cat
Chikorita hopping in a circle
Pizza spinning
Water Machine
- Press the pushbutton to get a nice, fresh water.
Bean Machine
- Put a token into the receptacle at the top to get some beans.
Token Machine
- Press the pushbutton to get some quarters shot at you.
Andy
- Feed Andy some beans and he'll give you A's.
Red flashing lights across the first floor to signify plant death x8
Bean man's ominous lighting
It is not hard to believe that running a VR game in a simulated environment versus on a real VR headset can produce different results. One of the big differences is framerate. Anyone who has worked with VR, or really any 3D rendering software, knows that loading highly detailed assets can take quite a bit of time; in a somewhat related experience, I have seen models for 3D printing take nearly ten minutes to render. Our project has many different assets, and if we had chosen the nicest, most detailed and realistic models, we would have run into this same issue. Not only would our initial load take quite some time, but each movement the user makes in VR would lag significantly.
Of course, in a simulated environment (either Unity’s “Play” mode or Steam’s VR simulator), FPS is almost irrelevant. The project may take a bit of time to load, but once it does, the user can move around freely with little to no lag. Once the VR headset is on, however, the user will very likely experience some delays and slow framerate. With the large number of assets we used, some slowdown was only natural, so we tried as much as possible to use and create low-poly models. In some instances (such as going outside of the room), the FPS drops quite a bit despite there being only low-poly models; part of the reason is that the program is rendering both the inside and outside of ScareCo.
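To put a number on why low-poly models matter so much: the Quest 1's display refreshes at 72 Hz, so the entire scene must be simulated and rendered within roughly 14 ms per frame, and any frame that overruns that budget shows up as the lag described above. The arithmetic, as a quick sketch:

```shell
# Per-frame time budget for a given display refresh rate. A frame that
# takes longer than this budget is dropped, which the user perceives as lag.
frame_budget_ms() {
  awk -v hz="$1" 'BEGIN { printf "%.1f\n", 1000 / hz }'
}

frame_budget_ms 72   # Quest 1 refresh rate: prints 13.9
```

That ~13.9 ms covers everything: physics, scripts, and drawing every asset in view, which is why rendering both the inside and outside of ScareCo at once is enough to blow the budget.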
Aside from FPS, the user may also become overwhelmed, especially during plant death, when the lights on the first level begin to flash. Since the headset completely envelops the user’s vision, this can cause discomfort and, in theory, even seizures if the user is predisposed to them. In simulations like Unity’s “Play” mode, the effect is very slight: the lights stay as light icons and the flashing is minor. Other issues, like motion sickness, cannot be assessed in simulations at all; a user can pass through the entire space with little to no issue in a simulation yet be completely unable to handle the actual VR experience. Because of this, it was critical that we as developers were able to test our functionality on the actual headset to work out practical issues.
The benefit of using VR is, of course, a more immersive experience. In simulations, the user has their screen and moves with keyboard and mouse; they would need to know that they move with the WASD keys and switch with the 1/2/3 keys. In VR, the user can walk around, get up close to our models, and be worried when Bean Man walks towards them. They can inspect the world around them. It feels real, and the user is not restricted to a 2D view where keyboard-and-mouse navigation can become frustrating; they can reach out and grab something with their controller. Overall, simulations provide the ideal case, but there is always a reason to test in VR ‒ and an even better reason to experience it in VR.
When testing a VRTK application, the simulator can prove to be both a boon and a bane. For simple interactions that do not rely heavily on the dexterity a user is afforded in VR, the simulator can greatly reduce a developer's turnaround time. One example is roughly testing whether a blocked-out space can be traversed easily enough, which speeds up environment development and allows a team to parallelize blocking out levels while other members add details to finished areas. Another example is testing basic hand manipulations, like pushbuttons and grabbing and throwing decorations. Beyond this, macros could make the simulator useful in more situations – in our case, one macro that would have been useful is inserting a token into the bean machine, resetting the hands, looking down, grabbing a bean, and resetting the hands again. With macros, the behaviors that make up the fiber of one's VR application could also be tested with Unity's Test Runner and Play Mode tests, allowing for greater TDD in VR while preventing bugs from surfacing unchecked.
The simulator should not be used as a crutch. Most design and implementation issues will not surface during small bouts with the simulator – or may be disregarded as the janky interactions that come from smooshing VR into a flat environment. One issue we encountered: when grabbing items off of horizontal surfaces (such as tables) in the simulator, it was very easy to avoid issues with the user's body collider. An item on a desk could be reached from much farther away in the simulator; the hands could reach across the room without moving the "body." In the headset, when reaching down to a lower surface, it's natural to bend over to grab the item – with VRTK's default whole-body capsule collider, this inevitably leads to the user's head causing their "body" to collide with the resting surface, which pushes them back and disorients them while leaving them no closer to the item they were trying to grab.
In the headset, it is much easier to manually test complex manipulations in the VR space. Even if one automates testing with the simulator, there is still great value in manually testing these interactions in the spirit of chaos testing. In the VR space with a person at the helm, there are an infinite number of variations that can occur during a gesture – different angles, speeds, and positions. These variations help a developer dig out bugs, especially physics bugs, before sending their creation out into the wild, where it will be subjected to everything a developer can test and more.
Design is also very important to check inside the headset. Are these areas too open and empty? Too claustrophobic? Is the scale appropriate? When the user steers their hands towards different objects, are they unintentionally interacting with closely packed items they shouldn't be? And a necessary test: is this area going to give a user motion sickness when it doesn't have to? Many questions can only be answered with the headset on, so it is important to strike a balance between simulator and headset testing to ensure both fast development and a quality result.