Week 3
— GAD174 —
UX is about looking at the overall picture of your UI.
Basically, UX asks, "how do we make someone's journey from point A to point B as good as possible?"
Don't just guess with UX—that is what research is for.
UX designers often create journey maps as part of their process; these basically break the experience down into its individual aspects.
You do not need a journey map for your brief. This is just a "good to know" for now.
UI is basically getting important information to the user. Common UI elements in games include health bars, inventories, ammunition, etc.
There are four types of UIs in digital game spaces:
Non-Diegetic - Heads-up display (e.g. ammo displayed on the HUD).
Diegetic - User interface that the character can see (e.g. ammo displayed on the gun).
Spatial - Exists in the game world, but is visible only to the player (e.g. an object pick-up icon hovering in the scene; see the sketch after this list).
Meta - Not shown in the game space, but exists in the world (e.g. blood splatter on the screen when your player takes damage/dies, or text shown on a separate panel with a cleaner font for accessibility).
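To make the spatial example concrete, here is a minimal Unity (C#) sketch of a pick-up icon on a World Space canvas that turns to face the player's camera. The class and field names are placeholders, not anything required by the brief.

```csharp
using UnityEngine;

// Attach to a World Space UI element (e.g. a pick-up icon floating above an object).
// The icon exists in the scene, but only the player "reads" it - spatial UI.
public class PickupIconBillboard : MonoBehaviour
{
    [SerializeField] private Camera targetCamera; // assumed: the player's camera

    private void LateUpdate()
    {
        if (targetCamera == null)
        {
            targetCamera = Camera.main;
            if (targetCamera == null) return;
        }

        // Rotate the icon so its front always faces the camera.
        transform.rotation = Quaternion.LookRotation(
            transform.position - targetCamera.transform.position);
    }
}
```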
Persona 5's UI is a mix of non-diegetic (the health bars, etc.), spatial (the control pop-ups around the character) and meta (the dialogue that is attached to the story).
Dead Space displaying the player's health on the spine of the character's suit is an example of diegetic UI (the only problem is that it's a slight contradiction, since the character can't typically see their own back, but it's still technically diegetic). Its other UI is spatial, as it exists in the game world but is there for the player only.
Minimalist UI is not good enough for this class. Minimalist UI is just text floating around in the UI space (e.g. Firewatch's UI, which presents subtitles, the current objective, the currently held item and contextually relevant controls as plain text, with maybe a backdrop or one or two icons).
Do not forget audio in your projects. Put real effort into your audio, but also give players the option to turn parts of it off (music and sound effects are two different things; see the sketch below).
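A minimal Unity (C#) sketch of one way to offer separate toggles, assuming an AudioMixer asset with two exposed volume parameters named "MusicVolume" and "SFXVolume" (both names are placeholders):

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Lets the player mute music and sound effects independently.
// Assumes an AudioMixer with exposed parameters "MusicVolume" and "SFXVolume".
public class AudioToggles : MonoBehaviour
{
    [SerializeField] private AudioMixer mixer;

    // Hook these up to UI Toggle components (On Value Changed).
    public void SetMusicEnabled(bool isOn)
    {
        mixer.SetFloat("MusicVolume", isOn ? 0f : -80f); // -80 dB is effectively silent
    }

    public void SetSfxEnabled(bool isOn)
    {
        mixer.SetFloat("SFXVolume", isOn ? 0f : -80f);
    }
}
```

Routing music and sound effects through separate mixer groups is what makes the independent toggles possible.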
Audio design in game spaces can easily follow the same categories as the UI breakdown mentioned earlier.
There are four types of audio in game spaces:
Non-Diegetic - Audio that is outside the world of the game, such as a musical score or UI interaction sounds without a narrative source.
Spatial - Similar to non-diegetic in that spatial audio is not part of the story; instead it is determined by the player's location within the game space (see the sketch after this list).
Diegetic - Sound that is part of the world the player is in; any audio produced in the game world (e.g. sound effects, ambient environmental sound, game character dialogue).
Meta - Audio which sits between the world of the protagonist and the player (e.g. a game narrator, which is not part of the game world but definitely adds to the story).
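In Unity, a lot of this comes down to how an AudioSource is configured. A minimal sketch (the component and field names are assumptions): a non-diegetic score plays as 2D audio heard the same everywhere, while a spatial source is fully 3D, so what the player hears depends on where they are in the game space.

```csharp
using UnityEngine;

// Configures two AudioSources to illustrate the difference:
// a 2D, non-diegetic music track and a 3D, positional (spatial) source.
public class AudioCategoryExample : MonoBehaviour
{
    [SerializeField] private AudioSource musicSource;   // e.g. the score
    [SerializeField] private AudioSource machineSource; // e.g. a humming machine in the scene

    private void Start()
    {
        // Non-diegetic: heard the same everywhere, no position in the world.
        musicSource.spatialBlend = 0f; // 0 = fully 2D
        musicSource.loop = true;
        musicSource.Play();

        // Spatial: volume and panning depend on the player's position.
        machineSource.spatialBlend = 1f; // 1 = fully 3D
        machineSource.loop = true;
        machineSource.Play();
    }
}
```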
There are four parts of a framework for game audio:
Zone - Ambient, environmental or background sounds within the game space (e.g. weather sounds like wind and rain, city noise, industrial noise, jungle sounds). Typically sourced from a single audio track, and generally not affected by the player, apart from when the player moves from one zone to the next (see the sketch after this list).
Effect - Sounds that are linked to the player's actions (e.g. footsteps from a character moving, gunshots from the player's gun, collisions such as one object hitting another). This does not include moving from one zone to the next.
Affect - The non-diegetic sounds that provide a setting for your game and exist outside of the game space (e.g. a music soundtrack that portrays the feel of the game).
Interface - Linked to the non-diegetic user interface (e.g. clicking a button in the UI), generally triggered when information has changed.
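A minimal Unity (C#) sketch of the Zone idea under these assumptions: each zone is marked by a trigger collider with its own looping ambience AudioSource, and the player object is tagged "Player" and has a Rigidbody or CharacterController so trigger events fire. All names are placeholders.

```csharp
using UnityEngine;

// Attach to a trigger collider that marks an audio zone (e.g. "jungle", "city").
// Starts this zone's looping ambience when the player walks in and stops it on exit.
[RequireComponent(typeof(Collider))]
public class AudioZone : MonoBehaviour
{
    [SerializeField] private AudioSource ambience; // looping zone track

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player") && !ambience.isPlaying)
        {
            ambience.loop = true;
            ambience.Play();
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            ambience.Stop();
        }
    }
}
```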
BY THE WAY, THIS ISN'T A 3D MODELLING CLASS!
Push back on scope for Project 1 - keep it small.
Don't build a fully fledged inventory system or full gun controls.
Current idea for Project 1: have the gun and bullet models in the scene; have the gun make sounds when specific parts are clicked (e.g. hammer, trigger); and have a button on the UI that opens a card with an image related to the gun (e.g. blueprints) on one side and information about the gun on the other side (see the sketch below).
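A minimal Unity (C#) sketch of that interaction flow, assuming each clickable gun part has its own collider and AudioClip, and the info card is a UI panel GameObject whose visibility is toggled by a Button. Every name here is a placeholder, not a requirement of the brief.

```csharp
using UnityEngine;

// Plays a sound when a specific gun part (e.g. hammer, trigger) is clicked,
// and toggles the UI info card. Assumes each part has a collider and that
// an AudioSource and the card panel are assigned in the Inspector.
public class GunInteraction : MonoBehaviour
{
    [SerializeField] private Camera playerCamera;
    [SerializeField] private AudioSource audioSource;
    [SerializeField] private AudioClip hammerClip;
    [SerializeField] private AudioClip triggerClip;
    [SerializeField] private GameObject infoCardPanel; // the UI card (starts disabled)

    private void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = playerCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                // Part names are placeholders - match them to your model's hierarchy.
                if (hit.collider.name == "Hammer")
                    audioSource.PlayOneShot(hammerClip);
                else if (hit.collider.name == "Trigger")
                    audioSource.PlayOneShot(triggerClip);
            }
        }
    }

    // Hook this up to the UI Button's OnClick event.
    public void ToggleInfoCard()
    {
        infoCardPanel.SetActive(!infoCardPanel.activeSelf);
    }
}
```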
Take screenshots during the process of making your models and implementing them into the game engine, and include them in your learning journal.
Make sure that you provide the following:
Reflections
Mid-Project (Process, Proficiency, Person)
Final (Appraisal, Challenges, Future Goals)
3D Model
3D Object implemented in Unity project
Workflow documentation (research, design, production)
UI
UI wireframe implemented in Unity project
Workflow documentation (research, design, production)
Audio
Audio implemented in Unity project
Workflow documentation (research, design, production)
Unity Project with your 3D model, UI and Audio implemented.
Documentation should include:
Screenshots/videos/gifs/sketches/etc
Short descriptions of what you're doing.
Don't go overboard with the word count.