Post date: Apr 02, 2019 6:06 PM
MaKayla Hensley
One of my biggest observations concerned storytelling through Virtual Reality: how to move the player through a story while keeping them as stationary as possible. Many games, including our own, approached story navigation in a unique way.
Ghost Giant keeps the story very stationary, sitting the player (a.k.a. the Ghost Giant) in the center of the space. If the player wishes to rotate, a click of a button shifts the world slightly while still showing a small chunk of the space they were looking at before. When the environment changes, it does so through a thematic loading screen of twinkling stars. The whole game has a cardboard-cutout, storybook-esque wrapper around it, and since the player is a supernatural spirit, the only part of themselves they ever see is their hands. If the player could navigate the world with a simple joystick or by clicking a spot on the map, it would break the intimacy of the connection between the player and the main character, Louis. With everything just a subtle turn away, the set dressing of the environments and puzzles builds up in a unique way that draws the player in to look closer rather than boring them with the space they are in. Since you are a supernatural being, the soft transition between levels through the stars is very effective at keeping the player from feeling like they are just waiting for the next level.
Trover Saves the Universe shows just how bizarre the explanation for navigation and transitions can be. You play as a “Couchopian,” an alien species that cannot leave their chairs. Your dogs have just been taken by a freaky bird creature who is using them as his eyes and gaining superpowers from their life essence. All a Couchopian can do is rotate in place and use their controllers to interact with objects in the world. You control a character named Trover, and together you set out to rescue your dogs. So you are playing a third-person game in VR; how does that work? In a third-person game you have to navigate and explore a world, but in VR you are supposed to stay stationary, right? Squanch Games solved this obscure problem by adding teleportation spots throughout the game, within eye distance of the player. All the player has to do is bring Trover to a teleportation spot, and with the press of a button the player is carried to the spot Trover is standing on. The very fact that you are on a couch with a controller is integrated into who you are in the game. Since the concept is so far out of left field (Justin Roiland and his team’s work does that a lot), the situation is explained to you subtly through television shows, or directly by Trover and other characters. Even though it is not a fully free-roaming experience where you can explore on your own two feet, it is still an effectively immersive one.
Budget Cuts is a virtual reality game where you explore an office environment populated by robots. When I experienced this game for myself, I unfortunately could not get very far (I couldn’t even throw a knife at a robot without getting killed). It is a very dense visual experience where your controllers act as your hands, your inventory, and a tool to teleport yourself across the numerous spaces of the office building. While I have no complaints about the story or the modeling, the navigation is not handled very effectively. Exploring a large office building with a teleportation gun can induce nausea in the player. Additionally, knowing your hands carry so many uses and tools can be confusing or even intimidating to a new player like myself. It could be argued that such a mechanical-feeling, hand-based toolkit plays into the machine-driven plot of Budget Cuts, but the teleportation tool does not fit the narrative of the story at all, and detaches the player even more.
I feel the subject of stationary storytelling is such a big deal because, for the longest time, virtual reality stations would take up entire rooms. To be brutally honest, almost no one has that kind of space for that kind of entertainment. There is also the added risk of tripping over a cable, hitting your friend, or, once again, making yourself motion sick. A good handful of teams have begun to realize this, but still want to deliver effective virtual reality experiences despite the lack of space. Level-to-level transitions that feel connected to the story also prevent player detachment, making the game a solid, consistent experience. Keeping a player seated not only reduces the hazard of injury, it also allows the player to spend more time in the game comfortably. A longer play session lets developers create more of a story that a player can follow without having to take breaks every 15-20 minutes.
Having learned this at GDC, I want to integrate it into future projects that I work on. House of Minori is already established, but it is something I plan to develop further after graduation. I want to be able to tell an effective story from one stationary spot without the player growing bored of their surroundings or their challenges. Between the puzzle approach of Ghost Giant and the bizarre extremes of Trover Saves the Universe, I feel I can effectively apply absurdity and storytelling to it. Additionally, if I ever work on the early stages of a VR game in the industry, I know for a fact I will be able to make my peers aware of these kinds of predicaments and offer them creative solutions for stationary VR gameplay.
Coattailing off the discussion of virtual reality, I also made several discoveries about what is being done with A.I. and Augmented Reality, and what these applications could mean for combinations of AR and physical products.
Pico Interactive had a booth on the show floor. While they displayed their VR headsets, they also showcased a table light that projects games and responds to hand gestures. Added to this was a demonstration of how the light’s augmented reality can work with programmed, 3D-printed toy products. There was a collection of pieces that could be rearranged to make a pathway, along with a cannon and a tank toy. Once the game began, the path created by the pieces was locked and monsters moved along it, but during the game the player could move the cannon and the tank to keep the monsters from reaching the end of the pathway. The staff also explained that a small set of pieces can be applied and reused across hundreds of different games. I found this particularly intriguing because I have seen the pairing of toys and games work effectively (Skylanders) and fall flat (Disney Infinity). This kind of direct connection could bring very compelling gameplay to already known games like Dungeons and Dragons, while also opening opportunities for a combined toy-and-app market, with plenty of creative room for those who want to experiment with the tool’s potential.
Another example of this kind of application was over at the Amazon booth, where the company’s A.I. home product Alexa demonstrated how the appliance can do more than order pizza or tell the time. The team there showed how Alexa can guide a family through fun games and immersive experiences using a game called St. Noir, which works like Clue, except that instead of playing against one another, the players have to work together to find a killer within a week, before they escape the city. As you play, Alexa provides fitting mood music, lays out what took place, and presents your options. When you want to talk to a suspect, they speak through Alexa in unique voices, and the team can ask the suspect questions to better solve the mystery.
I feel both of these intriguing appliances matter because these showcases demonstrate what technology can do to breathe fresh life into an old art. Family game nights could get a very powerful facelift, offering unique immersive gameplay through hybrid physical and technological tools. From a marketing and business standpoint this is super interesting: there is the physical product of toys, cards, and board games, not to mention the need to have the digital appliance on hand as well. Physical things like toys always appeal to both children and collector markets, while the digital aspect piques the interest of curious, computer-savvy young folks. It’s able to catch the interest of almost any demographic or age range.
Interacting with these combinations of digital and physical products at GDC has made me very curious, especially about the interaction between a toy and a projected digital game. I’ve been bubbling up a few interesting ideas about, first, how the digital application of a fun game could be produced and, second, how to produce physical products to coincide with it. Creating a hybrid game like this could be a really cool project, and in the future, whether the opportunities arise or not, I will certainly consider how the digital can be applied to physical products and what physical merchandise can become in the digital world.
Another observation I made at GDC is how visual effects goes hand in hand with both art and programming more than I previously realized. During mixers with other technical and VFX artists, meetups with Colin Harris (Bluepoint Games) and Dan Bruington (FXVille coworker), and a technical VFX roundtable, I gained a better awareness of how an understanding of both the art and the tech is needed to deliver even more robust and dazzling effects. With Niagara replacing Cascade, industry professionals have discovered both benefits and downsides. Because Niagara is a little more graph-based, novices are at a minor setback due to the lack of tutorials for the newer Unreal effects toolkit. One obvious benefit: instead of having to make copies of an effect to suit a different scale or color, there are now ways to make VFX instances that can be adjusted on an as-needed basis, which helps reduce frame rate issues from calling several duplicates of, for example, different swipe trails. It also provides more flexibility to create your own VFX functions for gameplay. It allows seasoned artists to produce more assets more quickly, while also providing room to dig a little deeper and experiment when time is in their favor.
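The instance-over-duplicate idea above can be sketched outside of Unreal. This is a minimal, hypothetical Python illustration (the names `EffectAsset` and `EffectInstance` are my own, not Niagara's API): one shared effect definition holds the defaults, and each lightweight instance stores only its overrides, so adding a bigger or recolored swipe trail never means copying the whole effect.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class EffectAsset:
    """One shared effect definition, e.g. a swipe trail."""
    name: str
    defaults: dict

@dataclass
class EffectInstance:
    """A per-use instance; only the overridden parameters are stored."""
    asset: EffectAsset
    overrides: dict = field(default_factory=dict)

    def param(self, key):
        # Fall back to the shared asset when no override exists.
        return self.overrides.get(key, self.asset.defaults[key])

# One asset, many cheap variations — no duplicated effect data.
trail = EffectAsset("SwipeTrail", {"scale": 1.0, "color": "white"})
big_red = EffectInstance(trail, {"scale": 2.0, "color": "red"})
small = EffectInstance(trail, {"scale": 0.5})

print(big_red.param("color"))  # overridden → "red"
print(small.param("color"))    # inherited → "white"
```

This mirrors the workflow described above only in spirit; in Niagara the same effect is achieved through user parameters exposed on the system and set per component at runtime.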
I feel this awareness is important to me personally because I am a very visual learner, and I lean heavily on the visuals when it comes to visual effects. While I can deliver good results with the knowledge I have, I am capable of delivering better ones. If I dive deeper into the technical side of art, I believe it will improve my know-how for the better. It will allow me to be more efficient with technical VFX programs like Popcorn and Houdini, and make me a more versatile member of a company compared to a duo of a visual effects artist and a digital engineer working side by side.
With this newfound knowledge, I plan to pursue more thorough observations and tests of technical VFX and apply them to my portfolio: dabble in Houdini and After Effects, and see what can be done with Niagara and its new features. Drew Skillman hosted the roundtable, and I plan to get in contact with him and others about what resources are out there for someone like myself to explore and learn more about these kinds of workflows.