Post date: Mar 25, 2018 9:33:14 PM
This was my second year attending GDC, and third time in San Francisco.
1) The first panel I attended at GDC covered Unreal Engine's updates to ray tracing for shadows and lights. The room was packed, and my friends and I squeezed into the back rows. The development team first talked about their past struggles with faking these effects and the roadblocks they had hit, mostly around performance. They teamed up with both Nvidia and Disney Imagineering to push the technology further, essentially throwing more GPU power at the problem so it could handle everything.
What ray tracing buys you is finer control over hard and soft shadows and how they soften the farther they fall from the subject, as well as highly reflective and mixed-surface reflections in real time under dynamic lighting. Lights can also carry subtle texture: if a light has a particular shape with a pattern overlaying it, like a hard-edged window pane, the reflections will follow that shape to some degree. This is a step up from faking it with combined lights and light functions. Typically, the crisper and more coherent the shadow, the more expensive it is to run in engine.
Ray sampling for the shadows did not come together smoothly, performance- or look-wise, for some time. The sampling left a lot of noise and grain on the assets and environment, and combined with metallics it created a haloing effect around the ambient occlusion, but the team was able to blur it out and smooth those transitions. The team also developed and refined a new bokeh depth-of-field filter for post-processing.
For the cinematic shots demonstrated in the presentation, a single motion-captured actor played every character. The dev team around him could see him in engine in real time and direct how he was placed on set. The entire demonstration was rendered in real time and captured in Sequencer to get every cinematic shot.
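The noise-versus-sample-count tradeoff the team described is easy to see in miniature. Below is a toy 2D sketch of stochastic area-light shadow sampling; this is not Unreal's implementation, and the function names and scene are my own. Visibility at a point is estimated as the fraction of random shadow rays toward the area light that arrive unblocked, so few rays give grainy penumbras and more rays smooth them out, which is exactly the grain that blurring/denoising then has to clean up.

```python
import random

def segment_hits_circle(p, q, center, radius):
    """True if the segment p->q passes through the circular occluder."""
    px, py = p; qx, qy = q; cx, cy = center
    dx, dy = qx - px, qy - py
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return (px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2
    # clamp the projection of the circle center onto the segment
    t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / seg_len2))
    nx, ny = px + t * dx, py + t * dy
    return (nx - cx) ** 2 + (ny - cy) ** 2 <= radius ** 2

def soft_shadow(point, light_a, light_b, occ_center, occ_radius, samples, rng):
    """Estimate visibility in [0, 1]: the fraction of shadow rays that reach
    random points on the area light (segment light_a->light_b) unblocked.
    Low `samples` -> grainy estimate; high `samples` -> smooth penumbra."""
    ax, ay = light_a
    bx, by = light_b
    unblocked = 0
    for _ in range(samples):
        u = rng.random()  # pick a random point along the area light
        lp = (ax + u * (bx - ax), ay + u * (by - ay))
        if not segment_hits_circle(point, lp, occ_center, occ_radius):
            unblocked += 1
    return unblocked / samples

if __name__ == "__main__":
    rng = random.Random(0)
    light_a, light_b = (-1.0, 5.0), (1.0, 5.0)   # area light overhead
    occ, r = (0.0, 2.0), 0.5                     # disk occluder in between
    for label, p in [("umbra", (0.0, 0.0)),
                     ("penumbra", (0.6, 0.0)),
                     ("lit", (3.0, 0.0))]:
        vis = soft_shadow(p, light_a, light_b, occ, r, 256, rng)
        print(f"{label}: visibility {vis:.2f}")
```

A point fully behind the occluder estimates 0, a fully exposed point estimates 1, and points in the penumbra land in between, jittering between runs at low sample counts.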
I'm not sure how useful this will be for games just yet, as it is fresh technology demonstrated on supercomputer-class hardware not really made for everyday use. But it is promising for further expansion of UE4 into games as well as cinematic entertainment. These features will be available in UE4 versions 4.19-4.20.
2) I attended a panel on Snapchat's Lens Studio. Of Snapchat's entire audience, at least a third use the AR filters provided through the app for at least three minutes each day. Snapchat creates its lenses through Lens Studio, which is completely free and open to the public. Snapchat's AR community has grown enough to spawn its own subreddit where creatives share and discuss their work, and Snapchat also releases special challenges that invite the community to create their own filters, keeping engagement up.
Lens Studio is extremely easy to use for anyone familiar with game or 3D modeling programs. You first upload the mesh, then import the skeletal mesh and animations. Then you choose a template that decides whether the asset is interactive in space or static: spawning an animation based on distance, for example, or making the asset tappable by the person holding the camera. Another template is portals, where the person holding the camera can essentially walk into a virtual portal and see a different environment through the lens. All of Snapchat's filters are completely open source, so anyone can take an existing filter and expand upon it, and lenses render in real time in an in-app preview. Meshes imported into Lens Studio also have Substance Painter integration; a panel speaker from Allegorithmic gave a brief rundown, aimed at non-artists, of applying smart materials to meshes.
I think this is super interesting: Snapchat is pushing creative boundaries with its own filters and financial backing while leaving those filters open source for others. This is steering and opening up the technology in ways that might not have happened without the creative community. I am extremely excited to see what comes out of this technology in the next few years.
3) Besides the panels I've noted above, I also had the chance to play several top-notch indie titles that I'm very excited about. The first comes from a Norwegian team of ten that creates stylized games with procedural shaders, similar to Breath of the Wild. Their newest game in development, Mesmer, is a combination of Animal Crossing and Don't Starve in which the player must convince the city's inhabitants to join their cause and eventually pull off a successful revolution. The developer I talked to noted that multiplayer may be implemented in the future. I personally love the combination of game types and the overall premise of this title, and I am hoping to play it on release. Here is their WIP trailer: http://www.ign.com/videos/2018/02/07/mesmer-game-concept-trailer .
The second is an almost top-down shoot-'em-up action survival game developed in China. A single artist on the team created all of the assets, including the VFX for player feedback: every time the player fired the hand cannon, the word "POP" appeared along with a sound effect. The enemies had distinct states (idle, walking, engaged, running) while around the player. The assets seem to be 2D (possibly mapped onto 3D shapes for in-game motion) and are hand painted. The characters are charming and the mood was fantastic; it made me feel invigorated for my next project. Compared to my thesis, I'd like to make more cohesive color choices and better contrast adjustments, as this artist has done.