Music/Audio and Software Development
My main contribution to the game this week was creating a first draft of the main song for our game. The game is going to be a call-and-response based rhythm game: the rhythms and actions that the player must perform are first indicated to them by the song (and potentially the visuals), and the player is then given an opportunity to perform those actions. Because of the restrictions inherent in writing music for this type of rhythm game, the first draft is rhythmically very simple. That being said, I am hopeful that we will be able to make the song more interesting in the future.
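To illustrate the structure (this is only a rough sketch with made-up types and names, not code from the project), a call-and-response phrase can be thought of as a "call" measure the song plays, followed by a "response" measure in which the player is expected to reproduce the same beats:

```cpp
// Hypothetical sketch of a call-and-response phrase: the response window
// mirrors the beats of the call measure.
#include <vector>

struct Measure {
    std::vector<double> beats;  // beat offsets within the measure
};

struct CallResponsePhrase {
    Measure call;      // measure the song plays to the player
    Measure response;  // measure in which the player repeats those beats
};

// Build a phrase whose response window mirrors the call exactly.
CallResponsePhrase makePhrase(const Measure& call) {
    return CallResponsePhrase{call, call};
}
```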
I spent this week creating the new version of our game's main song. Like the previous song, the track is original and was created using Ableton Live 11.
The previous song was deliberately simple. At the time, our plan was for the game to be a call-and-response based rhythm game, in which the player's response would, if performed correctly, mimic the audio of the "call" section. As such, the rhythms were intentionally simple, and there was a gap every other measure.
Since we have now decided to take a somewhat more traditional approach, in which the player mimics the song's rhythms in real time, I had the freedom to make the song more complex rhythmically and otherwise.
This past week I started working on a new hit-detection system for the dish washing station. Though the current hit-detection system works, implementing new functionality on top of it has proven difficult in some situations. As such, we have decided that a collision-based hit-detection system would be simpler and easier to build on.
My main task this past week was finishing up the new collision-based hit-detection system for the dish washing station.
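The core idea, sketched below with hypothetical types (the real system lives in our Unity scripts), is that a hit is registered when the player's scrub collider overlaps a smudge collider, rather than by comparing raw input against a schedule:

```cpp
// Minimal sketch of a collision-based hit check using axis-aligned boxes.
struct AABB {
    float minX, minY, maxX, maxY;
};

bool overlaps(const AABB& a, const AABB& b) {
    return a.minX <= b.maxX && b.minX <= a.maxX &&
           a.minY <= b.maxY && b.minY <= a.maxY;
}

// Returns true when the scrub currently covers the smudge; gameplay code can
// then judge the hit against the song's timing.
bool smudgeHit(const AABB& scrub, const AABB& smudge) {
    return overlaps(scrub, smudge);
}
```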
My main focus this week has been getting FMOD working properly with the game and taking advantage of its functionality as we have planned.
First, I had to install the Unity FMOD integration and get the music playing properly.
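In the project this goes through the FMOD for Unity integration (C#); the sketch below shows the rough native FMOD Studio C++ equivalent of "get the music playing" (bank and event paths are placeholders):

```cpp
#include <fmod_studio.hpp>

void startMusic() {
    // Create and initialize the FMOD Studio system.
    FMOD::Studio::System* system = nullptr;
    FMOD::Studio::System::create(&system);
    system->initialize(512, FMOD_STUDIO_INIT_NORMAL, FMOD_INIT_NORMAL, nullptr);

    // Load the banks that contain the music event.
    FMOD::Studio::Bank* master = nullptr;
    FMOD::Studio::Bank* strings = nullptr;
    system->loadBankFile("Master.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &master);
    system->loadBankFile("Master.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &strings);

    // Look up the music event and start an instance of it.
    FMOD::Studio::EventDescription* musicDesc = nullptr;
    system->getEvent("event:/Music/MainSong", &musicDesc);  // placeholder path

    FMOD::Studio::EventInstance* music = nullptr;
    musicDesc->createInstance(&music);
    music->start();

    // system->update() must then be called every frame
    // (the Unity integration handles this for us).
}
```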
Next, I implemented a rhythmic timer, which receives real-time callbacks from the FMOD event. This allows us to ensure that the game is always in time with the music.
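The sketch below shows the beat-callback approach using the native FMOD Studio C++ API (in the project we do the equivalent through the Unity integration): FMOD invokes the callback on every timeline beat, and the rhythmic timer reads the latest beat from the game thread.

```cpp
#include <atomic>
#include <fmod_studio.hpp>

static std::atomic<int> g_lastBar{0};
static std::atomic<int> g_lastBeat{0};

FMOD_RESULT F_CALLBACK BeatCallback(FMOD_STUDIO_EVENT_CALLBACK_TYPE type,
                                    FMOD_STUDIO_EVENTINSTANCE* /*event*/,
                                    void* parameters) {
    if (type == FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_BEAT) {
        auto* props = static_cast<FMOD_STUDIO_TIMELINE_BEAT_PROPERTIES*>(parameters);
        // The callback runs on FMOD's thread, so only store plain data here;
        // the rhythmic timer reads it on the game thread.
        g_lastBar.store(props->bar);
        g_lastBeat.store(props->beat);
    }
    return FMOD_OK;
}

void hookRhythmTimer(FMOD::Studio::EventInstance* music) {
    music->setCallback(BeatCallback, FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_BEAT);
}
```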
My last main task was to re-implement the previously existing SFX, namely the hi-hat, and to implement the new batter-sizzling sound effect.
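One way to fire a one-shot SFX such as the hi-hat or the batter sizzle (again a native FMOD Studio C++ sketch; the event path is a placeholder):

```cpp
#include <fmod_studio.hpp>

void playOneShot(FMOD::Studio::System* system, const char* path) {
    FMOD::Studio::EventDescription* desc = nullptr;
    if (system->getEvent(path, &desc) != FMOD_OK) return;

    FMOD::Studio::EventInstance* instance = nullptr;
    desc->createInstance(&instance);
    instance->start();
    instance->release();  // instance frees itself once it finishes playing
}

// e.g. playOneShot(system, "event:/SFX/HiHat");
```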
This first week of the new term has primarily been spent setting our goals for the coming term. Personally, I have set out to expand the game's adaptive audio functionality through FMOD.
This past week was spent planning for, and beginning the implementation of, the composer/interpreter. This is the system through which we will allow the game to "talk to" FMOD and perform audio-based actions. In this context, "audio-based action" is an intentionally broad term: it encompasses anything from adjusting the volume of a particular track, to changing the pitch of a sound effect or track, to triggering a one-shot, and so on.
The work distribution for the composer is as follows: Will implements the top-level composer script, which determines when audio-based actions occur, and I implement the lower-level "composer interpreter" script, which interfaces directly with FMOD and handles the execution of audio-based actions.
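A rough sketch of that split, with hypothetical names (the actual scripts are Unity C#; here the interpreter is shown against the native FMOD Studio API): the composer decides when an audio-based action happens, and the interpreter translates it into FMOD calls.

```cpp
#include <string>
#include <fmod_studio.hpp>

struct AudioAction {
    enum class Kind { SetVolume, SetPitch, PlayOneShot } kind;
    std::string target;   // event path for one-shots (unused otherwise)
    float value = 0.0f;   // volume or pitch value
};

class ComposerInterpreter {
public:
    ComposerInterpreter(FMOD::Studio::System* system,
                        FMOD::Studio::EventInstance* music)
        : system_(system), music_(music) {}

    // Called by the top-level composer whenever it decides an action occurs.
    void execute(const AudioAction& action) {
        switch (action.kind) {
            case AudioAction::Kind::SetVolume:
                music_->setVolume(action.value);
                break;
            case AudioAction::Kind::SetPitch:
                music_->setPitch(action.value);
                break;
            case AudioAction::Kind::PlayOneShot:
                playOneShot(action.target.c_str());
                break;
        }
    }

private:
    void playOneShot(const char* path) {
        FMOD::Studio::EventDescription* desc = nullptr;
        if (system_->getEvent(path, &desc) != FMOD_OK) return;
        FMOD::Studio::EventInstance* instance = nullptr;
        desc->createInstance(&instance);
        instance->start();
        instance->release();
    }

    FMOD::Studio::System* system_;
    FMOD::Studio::EventInstance* music_;
};
```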
This week, Will and I have worked to expand the composer's functionality, implementing volume control and EQ control (which muffles a given sound when active).
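Sketched below are the two new controls (the parameter name "Muffle" is a placeholder for whatever the FMOD project exposes; the EQ/lowpass itself is authored inside FMOD Studio and driven by that parameter):

```cpp
#include <fmod_studio.hpp>

void setTrackVolume(FMOD::Studio::EventInstance* track, float volume01) {
    track->setVolume(volume01);  // 0.0 = silent, 1.0 = full volume
}

void setMuffled(FMOD::Studio::EventInstance* track, bool muffled) {
    // The parameter is mapped to an EQ/lowpass effect inside FMOD Studio,
    // so 1 "muffles" the sound and 0 restores it.
    track->setParameterByName("Muffle", muffled ? 1.0f : 0.0f);
}
```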
This past week has been dedicated to reformatting the original song from the game so that it functions properly with our new system. This entails exporting a number of "component tracks" (bass, drums, melody, etc.) from the original song and importing them into FMOD so the track can be recreated there. Doing this gives us a much greater degree of control over the audio, as we can now manipulate the individual component tracks in various ways with the help of the composer system.
This week I worked with Will to add a new song using FMOD. In so doing, we have established the steps necessary to correctly add new songs to the game.
This week I worked to expand the integration of FMOD with our game by adding new parameters that we can manipulate via code for each track in a given song. The latest addition was pitch-shifting functionality.
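Two ways this can look in code, sketched with the native FMOD Studio C++ API (the parameter name "Pitch" is a placeholder): the coarse per-instance pitch, or a parameter driving a pitch-shifter effect on an individual track inside the FMOD project.

```cpp
#include <fmod_studio.hpp>

void setSongPitch(FMOD::Studio::EventInstance* song, float pitchMultiplier) {
    // 1.0 = original pitch/speed; affects the whole event instance.
    song->setPitch(pitchMultiplier);
}

void setTrackPitchParameter(FMOD::Studio::EventInstance* song, float semitones) {
    // Parameter mapped to a pitch shifter on one track in FMOD Studio.
    song->setParameterByName("Pitch", semitones);
}
```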
I spent this past week looking for new sound effects to implement in the game. This involved a mix of finding royalty-free assets and creating some new ones using Ableton Live (a digital audio workstation).
This week, the team noticed that our pitch-shifting feature was causing some technical issues in FMOD. The issue caused all songs to sound somewhat warped, even when they were essentially on pitch. Will and I worked to resolve this issue.
In order to make the gameplay feel more dynamic, we decided to integrate a variety of hi-hat sound effects for the dish station. Each song has its own set of hi-hat sound effects, and each set consists of three audio clips corresponding to the player's timing in hitting a smudge: early, on-time, and late.
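The clip selection could map to timing roughly as sketched below (window values and event paths are placeholders, not the tuned values from the game):

```cpp
#include <cmath>
#include <string>

struct HiHatSet {
    std::string early;   // e.g. "event:/SFX/Song1/HiHat_Early"
    std::string onTime;  // e.g. "event:/SFX/Song1/HiHat_OnTime"
    std::string late;    // e.g. "event:/SFX/Song1/HiHat_Late"
};

// offsetSeconds: hit time minus the nearest beat time (negative = early).
const std::string& pickHiHat(const HiHatSet& set, double offsetSeconds,
                             double onTimeWindow = 0.05) {
    if (std::abs(offsetSeconds) <= onTimeWindow) return set.onTime;
    return offsetSeconds < 0.0 ? set.early : set.late;
}
```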
My most recent contribution to the game was to implement the song I had created last term according to our new audio integration system. In a previous post I mentioned reformatting the song (separating it into component tracks), but it hadn't been properly implemented in the game until now.
During this past week, as the team has worked to put the finishing touches on the game, I did my part by integrating menu music into the game. This involved creating the music and adapting the code (and creating new scripts) to trigger the menu music when appropriate.
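A minimal sketch of the trigger logic, with hypothetical names and a placeholder event path (the real version is a Unity C# script): start the menu event when the menu opens, and stop it with a fade-out when gameplay begins.

```cpp
#include <fmod_studio.hpp>

class MenuMusic {
public:
    explicit MenuMusic(FMOD::Studio::System* system) : system_(system) {}

    void onMenuOpened() {
        FMOD::Studio::EventDescription* desc = nullptr;
        if (system_->getEvent("event:/Music/Menu", &desc) != FMOD_OK) return;
        desc->createInstance(&instance_);
        instance_->start();
    }

    void onGameplayStarted() {
        if (!instance_) return;
        instance_->stop(FMOD_STUDIO_STOP_ALLOWFADEOUT);
        instance_->release();
        instance_ = nullptr;
    }

private:
    FMOD::Studio::System* system_ = nullptr;
    FMOD::Studio::EventInstance* instance_ = nullptr;
};
```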