A project to create assistive and accessibility systems for danger detection and playability, built as a plugin for Unreal Engine 4.
— PROJECT NAME
Assistive Systems Project
— ROLE
Game Plugin Developer
— DATE
September 2019
This project is a plugin that includes assistive and accessibility systems for both danger detection and difficulty assistance. The plugin features two main systems: a jump assistance system and an audio-to-text system.
The plugin lets developers integrate these systems into their projects with a wide range of options and modifications. Because the plugin was built modularly, these options are clearly exposed to the developer: they can all be adjusted from the engine's settings window, and if deeper changes are needed, the commented code can be modified directly.
Dynamic Platforming Assistance
The dynamic platforming assistance lets developers include assistive systems that players of all skill levels can use in certain situations. The developer can use the dynamic system, which detects where assistance is needed and places platforms for the player during the game, with configurable conditions controlling when they spawn. Alternatively, developers can manually place these systems in the world, giving more control over when and where they are used.
This system is intended for lower difficulties, or to add dynamic difficulty to a game, giving the player assistance if they need or want it. For example, the assistance could become available once the player has attempted a jump and failed a certain number of times, offering them another option instead of quitting the game or getting stuck.
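The failure-triggered behaviour described above can be sketched outside the engine as a simple per-jump counter. This is a minimal, hypothetical sketch in plain C++; the class, the threshold, and the jump identifiers are illustrative assumptions, not the plugin's actual code.

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch: tracks failed attempts per jump and decides when
// the dynamic platform assistance should become available.
class JumpAssistTracker {
public:
    // FailureThreshold is an assumed, developer-tunable setting.
    explicit JumpAssistTracker(int FailureThreshold = 3)
        : Threshold(FailureThreshold) {}

    // Called when the player falls short of a tracked jump.
    void RecordFailure(const std::string& JumpId) { ++Failures[JumpId]; }

    // Called when the player lands the jump; assistance is no longer needed.
    void RecordSuccess(const std::string& JumpId) { Failures.erase(JumpId); }

    // True once the player has failed this jump enough times to be
    // offered an assistive platform.
    bool ShouldOfferAssist(const std::string& JumpId) const {
        auto It = Failures.find(JumpId);
        return It != Failures.end() && It->second >= Threshold;
    }

private:
    int Threshold;
    std::unordered_map<std::string, int> Failures;
};
```

In the plugin itself this decision would hang off the player's control component and feed the platform-spawning logic; the sketch only shows the counting rule.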
Footage of jump assistance in work in the demo level.
Audio to text System
The audio-to-text system was developed to further the accessibility offered by games. Currently the major accessibility feature is subtitles for dialogue, but sound is a key part of game design. The audio-to-text system turns sounds into a readable format using a location-based text display. To keep the design minimalistic but functional, five zones of importance were created, each covering 36 degrees of the player's forward 180-degree arc.
This allowed for a non-obtrusive accessibility feature that gives no additional advantage over normal gameplay, only expanding its accessibility. It makes games more inclusive and, like subtitles, can be toggled in the settings. The sound system uses a component attached to the player that takes the camera's forward direction and uses a dot product to find the angle between the sound and the player's forward vector. Each sound is then stored, and the display sorts through all relevant stored sounds to show those of the highest importance.
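The zone lookup can be sketched with plain vector math, independent of the engine. The five 36-degree zones across the forward 180 degrees follow the description above; the 2D vector type and function names are illustrative assumptions.

```cpp
#include <cmath>

// Hypothetical sketch of the zone lookup used by the audio-to-text
// display. Angles are measured in the horizontal plane: 0 degrees is
// straight ahead, negative to the player's left, positive to the right.

struct Vec2 { float X; float Y; };

// Signed angle (degrees) from the player's forward vector to the
// direction of the sound: the dot product gives the magnitude, the 2D
// cross product gives the side.
float SignedAngleDeg(Vec2 Forward, Vec2 ToSound) {
    float Dot   = Forward.X * ToSound.X + Forward.Y * ToSound.Y;
    float Cross = Forward.X * ToSound.Y - Forward.Y * ToSound.X;
    return std::atan2(Cross, Dot) * 180.0f / 3.14159265f;
}

// Maps an angle in the forward 180-degree arc to one of five 36-degree
// zones (0 = far left, 2 = centre, 4 = far right). Sounds behind the
// player fall outside the arc and return -1 here.
int ZoneForAngle(float AngleDeg) {
    if (AngleDeg < -90.0f || AngleDeg > 90.0f) return -1;
    int Zone = static_cast<int>((AngleDeg + 90.0f) / 36.0f);
    return Zone > 4 ? 4 : Zone; // exactly +90 would index 5; clamp it
}
```

A sound straight ahead lands in the centre zone (index 2), while sounds at the edges of the arc land in zones 0 and 4.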
As with normal hearing, the loudest sounds are the most distinguishable, so they are given the highest priority. Like the other systems, this was designed with modularity in mind, allowing the developer to modify the visual aspects through the menu or edit the commented code directly.
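The loudness-first ordering amounts to sorting the stored sounds by volume before display and keeping only as many as the display can show. A minimal sketch under that reading, with the record layout and names assumed:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical record for a stored sound awaiting display.
struct StoredSound {
    int   Zone;     // which of the five display zones it falls in
    float Loudness; // perceived volume; louder sounds are more distinguishable
};

// Orders sounds so the loudest (highest-priority) come first, then trims
// the list to the number of entries the display can show at once.
std::vector<StoredSound> PickSoundsToDisplay(std::vector<StoredSound> Sounds,
                                             std::size_t MaxShown) {
    std::sort(Sounds.begin(), Sounds.end(),
              [](const StoredSound& A, const StoredSound& B) {
                  return A.Loudness > B.Loudness;
              });
    if (Sounds.size() > MaxShown) Sounds.resize(MaxShown);
    return Sounds;
}
```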
Footage of the audio to text system in work in the demo level.
The Danger Detection System
The danger detection system is the core of the project and does all of the dynamic work. It consists of a component attached to the player that spawns five AI agents every second. Each agent runs a unique environmental query based on the player's position, gathering data on where danger is likely to appear in the environment.
The AI have their tick speed increased tenfold, so in one second they can traverse an amount of the environment that would usually take them ten. Although this increases the computational load, the added cost is minor. The system is built to detect dangers in the environment, such as those handled by the jump assistance system, and also enemies, by checking whether an actor has an enemy component attached. This enemy component is what identifies an actor as an enemy to the AI, which relays the data back to the player's control component; that component then decides what to do with the information.
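The enemy-component check amounts to simple tag-style detection: a scout AI asks each actor it encounters whether it carries the enemy component, and reports the hits back to the player's control component. A plain-C++ sketch of that idea, with all the types and names assumed:

```cpp
#include <utility>
#include <vector>

// Hypothetical stand-in for an engine actor: it either has the plugin's
// enemy component attached or it does not.
struct Actor {
    bool  bHasEnemyComponent; // set by attaching the enemy component
    float X;                  // world position of the actor
    float Y;
};

// The data a scout AI relays back to the player's control component
// after one sweep of the environment.
struct DangerReport {
    std::vector<std::pair<float, float>> EnemyPositions;
};

// Collects the positions of every enemy the scout encountered.
DangerReport ScanEncounteredActors(const std::vector<Actor>& Encountered) {
    DangerReport Report;
    for (const Actor& A : Encountered) {
        if (A.bHasEnemyComponent) {
            Report.EnemyPositions.push_back({A.X, A.Y});
        }
    }
    return Report;
}
```

In the plugin, this report is what feeds assistive features such as the enemy foretelling described next; the sketch only shows the filtering step.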
By default, the player is given the ability to see where an enemy AI will travel next through enemy foretelling. This assistive system is one example of what can be built on the information provided by the AI danger detection. Because the environment is assessed during gameplay, these systems can be used in fully dynamic environments, including procedurally generated maps.
Footage of the danger detection and enemy foretelling system in work in the demo level.
Overview
This project has been a great experience, allowing me to explore different aspects of game development: not only how to improve gameplay in general, but also how to improve it for people with disabilities such as deafness. It is very important that we build great games, and even more important that we make those games inclusive. I have learnt a lot about how the right amount of assistance can help players, and it is a lesson I will bring into future projects: to keep making games more inclusive in their design and gameplay, hopefully expanding their audiences rather than excluding people, and ideally helping features like the ones explored in this project become a fundamental requirement.