An AR window into a VR experience
The project consists of two parts - a glimmering VR winter landscape in which to build snowmen, and an AR camera that combines interactive virtual elements with a real-world image. The camera gives spectators insight into the VR player's explorations without losing sight of the player themselves. Integration between reality and VR is enhanced as real-world objects can occlude virtual ones.
Snow Landscape
The snow landscape is rendered with a custom shader that produces a beautiful, uniform snow surface which deforms where the user and objects interact with it, creating a more realistic experience.
To add to the immersion, there is also a light snowfall within the VR environment.
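The deformation idea can be sketched as a height map that contacting objects "stamp" into. This is only an illustrative model - in the actual project the deformation lives in a Godot viewport driving a spatial shader, and all names, sizes, and values below are hypothetical:

```python
# Hedged sketch: snow deformation as a height map that objects press into.
# The grid resolution, stamp shape, and class names are illustrative only.

class SnowHeightMap:
    def __init__(self, size, depth=1.0):
        self.size = size
        self.depth = depth  # maximum deformation depth
        # 0.0 = untouched snow, 1.0 = fully compressed
        self.cells = [[0.0] * size for _ in range(size)]

    def stamp(self, cx, cy, radius):
        """Press a circular imprint into the snow around (cx, cy)."""
        for y in range(max(0, cy - radius), min(self.size, cy + radius + 1)):
            for x in range(max(0, cx - radius), min(self.size, cx + radius + 1)):
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                    self.cells[y][x] = 1.0  # deformation persists in the snow

    def height_at(self, x, y):
        """Vertical offset a displacement shader would apply at this cell."""
        return -self.depth * self.cells[y][x]

snow = SnowHeightMap(size=64)
snow.stamp(32, 32, radius=4)   # e.g. a footstep or a rolled snowball
print(snow.height_at(32, 32))  # -1.0 (compressed)
print(snow.height_at(0, 0))    # 0.0 (untouched)
```

In the real implementation the equivalent of `height_at` is a texture lookup in the vertex shader, so the deformation costs almost nothing per frame.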
Sparkling Snowballs
Like regular snow, the snowballs sparkle due to the crystals reflecting the sunlight! Built using shaders, this effect dynamically changes when you move the snowball or move your head.
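One way to model this view-dependent glinting - a sketch under assumptions, not the project's actual shader - is to give each crystal a random facet normal and light it only when that facet nearly mirrors the view direction, so moving the snowball or your head changes which crystals light up:

```python
# Hedged sketch of view-dependent sparkle. The threshold and names are
# illustrative; the project implements this effect in a Godot shader.
import math
import random

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def sparkle(view_dir, facet_normal, threshold=0.98):
    """A crystal glints when its facet points almost exactly at the viewer."""
    alignment = sum(a * b for a, b in zip(normalize(view_dir),
                                          normalize(facet_normal)))
    return alignment > threshold

random.seed(7)
facets = [normalize((random.uniform(-1, 1),
                     random.uniform(-1, 1),
                     random.uniform(0.1, 1)))
          for _ in range(1000)]  # random crystal orientations on the surface
view = (0.0, 0.0, 1.0)
lit = sum(sparkle(view, f) for f in facets)
print(f"{lit} of {len(facets)} crystals sparkle from this angle")
```

Rotating `view` yields a different subset of lit crystals, which is exactly the shimmering effect described above.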
Should you choose to destroy your creations, the snowballs explode in a multi-layered effect created using particle systems.
AR Occlusion
Using filters from the Intel RealSense library, depth data is processed and sent to Godot to make real objects occlude virtual objects and vice-versa. This turns the spectator view into an AR-experience.
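The core of such occlusion is a per-pixel depth comparison: whichever layer - the real camera feed or the rendered virtual scene - is closer to the camera wins that pixel. The sketch below illustrates the idea on flat lists; the names and structures are hypothetical, not the actual RealSense/Godot pipeline:

```python
# Hedged sketch of per-pixel depth compositing for AR occlusion.
# Depths are in meters; FAR marks "no reading / nothing rendered here".

FAR = float("inf")

def composite(real_depth, virtual_depth, real_px, virtual_px):
    """Choose, per pixel, the layer closest to the camera."""
    out = []
    for rd, vd, rp, vp in zip(real_depth, virtual_depth, real_px, virtual_px):
        # The virtual pixel only shows through where it is strictly nearer,
        # so a real object (e.g. a hand) in front occludes the virtual one.
        out.append(vp if vd < rd else rp)
    return out

# A hand at 0.8 m in front of a snowball rendered at 1.5 m:
real_depth    = [0.8, 0.8, FAR]
virtual_depth = [1.5, FAR, 1.5]
print(composite(real_depth, virtual_depth,
                ["hand", "hand", "bg"], ["ball", "ball", "ball"]))
# -> ['hand', 'hand', 'ball']
```

The filtering mentioned above matters because raw depth frames are noisy; without it, occlusion boundaries flicker pixel by pixel.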
Creating with Snow
In order to create a snowball, the player grabs a handful of snow from the ground using the VR controller. To make it bigger, the player places it on the ground and pushes it around. Once a desirable size has been achieved, players can stack snowballs to their heart's content.
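The growth-by-rolling step can be modelled as the snowball sweeping up snow in proportion to the distance pushed, so its radius grows with the cube root of accumulated volume. The pickup rate and function names below are illustrative assumptions, not the project's tuned values:

```python
# Hedged sketch: a snowball gains volume proportional to the snow swept
# while being pushed across the ground. Constants are illustrative.
import math

def roll(radius, distance, pickup_rate=0.1):
    """Grow a snowball pushed `distance` meters across the snow.

    Swept volume ~ contact width (proportional to radius) * distance * rate.
    """
    volume = (4 / 3) * math.pi * radius ** 3
    volume += pickup_rate * radius * distance
    return ((3 * volume) / (4 * math.pi)) ** (1 / 3)

r = 0.05                # a handful of snow, 5 cm
for _ in range(10):     # push it around in ten 1 m strokes
    r = roll(r, 1.0)
print(f"radius after rolling: {r:.2f} m")
```

Because growth scales with the current radius, big snowballs grow faster per meter pushed than small ones - matching the intuition of rolling real snow.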
Throwing Snowballs
What good is a winter simulation if you can't start a snowball fight? Many users were entertained by testing the limits of object handling: throwing snowballs, smashing them together, and even attempting to juggle.
The Camera
The AR camera is composed of a RealSense depth camera and an HTC Vive tracker, allowing users outside of the VR experience to move the camera and change the viewing perspective.
However, the AR camera does not only show people outside of VR what is happening on the inside. The camera itself is present in the VR scene as a floating, ever-present observer. Watch out - you might just get a snowball thrown your way!
The goal was to create a visual sandbox experience in VR that would also actively engage spectators. The team also wanted to experiment with the available depth sensor, so improving the spectator experience of VR became a secondary goal.
VR Interactions
One of the main challenges of the VR experience was designing the interaction with the snow. The use of controllers as opposed to tracking hand gestures provided the freedom to keep tracking actions such as throwing even when the hand was not visible. However, using a trigger button to pick up the snowball caused mild confusion in some users, who had to be guided towards possible actions (most notably the possibility of pushing a snowball on the ground, which was only possible while not actively holding the snowball).
Physically accurate and bug-free snowball stacking posed another challenge. Finding a balance between factors such as snowball size and position relative to previously placed snowballs proved difficult. With time being a factor, the decision was made to check only the stacking angle, preventing creations that obviously defied the laws of physics - although it should be noted that users of earlier implementations found building such structures entertaining.
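The angle check described above might look something like the following sketch, where a snowball may rest on another only if the line between their centers stays close to vertical. The threshold and names are illustrative assumptions, not the project's actual values:

```python
# Hedged sketch of an angle-based stacking check.
import math

def can_stack(lower_center, upper_center, max_angle_deg=25.0):
    """Reject stacks whose center-to-center line leans too far from vertical."""
    dx = upper_center[0] - lower_center[0]
    dy = upper_center[1] - lower_center[1]  # y is the up axis
    dz = upper_center[2] - lower_center[2]
    if dy <= 0:
        return False  # the upper ball must actually sit above the lower one
    # Angle between the center-to-center line and the vertical axis.
    lean = math.degrees(math.atan2(math.hypot(dx, dz), dy))
    return lean <= max_angle_deg

print(can_stack((0, 0, 0), (0.05, 0.5, 0)))  # slight lean: True
print(can_stack((0, 0, 0), (0.4, 0.3, 0)))   # obviously off-balance: False
```

A single threshold like this trades physical accuracy for robustness - exactly the time-driven compromise described above.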
AR
The RealSense refines its depth data using an IR filter pass. As it turns out, this pass operates at the same frequency as the VR tracking technology, resulting in heavy interference. The pass could be turned off, at the expense of the quality of the noise reduction in the depth data.
Aligning the VR environment and the real world posed another challenge, as the camera and Vive tracker could be assembled at different angles. This was solved by manually tweaking the alignment of the virtual camera until sufficiently good results were observed.
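The manual fix amounts to composing a hand-tuned rotation offset with the tracker's pose before it drives the virtual camera. The sketch below shows the idea for a yaw-only offset; the offset value and names are illustrative, since in the project the correction was found by eye:

```python
# Hedged sketch: apply a hand-tuned mounting-offset rotation to the
# tracker's forward vector. The 12.5 degree value is illustrative only.
import math

def yaw_matrix(deg):
    """Rotation matrix about the vertical (y) axis."""
    r = math.radians(deg)
    return [[math.cos(r), 0, math.sin(r)],
            [0, 1, 0],
            [-math.sin(r), 0, math.cos(r)]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

TRACKER_MOUNT_YAW = 12.5  # degrees, tuned by hand (hypothetical value)

def corrected_forward(tracker_forward):
    """Rotate the tracker's forward vector by the hand-tuned mounting offset."""
    return apply(yaw_matrix(TRACKER_MOUNT_YAW), tracker_forward)

fwd = corrected_forward((0.0, 0.0, 1.0))
print(tuple(round(c, 3) for c in fwd))
```

In practice a full calibration would also solve for pitch, roll, and translation, but a single tweakable offset per axis was enough to get a convincing overlap.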
One of the more memorable things about the project was how it was received by different audiences. People with little to no development experience saw the simple graphics, often gave the impression that they did not quite appreciate the difficulty of the problem solved, and frequently disengaged from the conversation. People with experience often engaged in discussions about where the technology might find a use case. It is a very effective example of how differences in knowledge and expectations can shape how a project is received.
On another note, the lack of a goal for the person holding the camera limited the project in terms of overall engagement. Since the camera perspective was the only thing visible to passersby, the project did not draw people in, meaning they never got to experience the other part of the project. Although the project was intended as a tech demo, it may well have benefited from an additional interaction involving the two players. Since quite a few people passed on the opportunity to try the VR headset owing to personal preferences, it would also have been beneficial to offer the option of switching the viewpoint to a fully virtual one. All in all, this would have allowed us to showcase more of our work to more people.
The interactive snow in the project follows a concept introduced by Andreas Junker and George Palamas: a viewport observes the snow surface and anything in contact with it, and the result drives real-time changes in a spatial shader.
isaklar@kth.se
antlilja@kth.se
jgorwat@kth.se
hansing@kth.se
shallma@kth.se