The biggest impact I had on the first Prototype release was helping get the animation system working and testing out motion capture data. This was something our team wanted to utilize because, when it comes to sports games, selling the experience means the game not only needs to play well but also needs to look good.
Throughout the first few weeks of the project, we experimented with Unreal Engine 5 to see what direction we wanted to take this basketball game. Since one of the core components of the game was its animations, that was something I had my eyes on. Starting with a motion capture (mocap) animation pack, I wanted to see how easy it was to retarget bones onto the player character. The video below summarizes the efforts made during that short time in Unreal Engine.
Although I had trouble in the beginning, I changed my implementation approach by switching from Unreal Engine 5 to Unity for a more straightforward way of retargeting the bones from the source animations to the player model. Using the Avatar system Unity provides, I was able to properly map each limb and reuse the animations on any player model that has the same bone setup as the Avatar.
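The retargeting itself happens in Unity's import settings rather than in code, but here is a minimal sketch of how that setup pays off at runtime: once every character is imported as a Humanoid rig with a valid Avatar, one shared controller and its mocap clips can drive any of them. The field names (sharedController, characters) are placeholders for this example, not names from the project.

```csharp
using UnityEngine;

// Minimal sketch, assuming each player model was imported with the Humanoid
// rig type and a correctly mapped Avatar. The same controller and clips can
// then be assigned to any of those characters.
public class SharedHumanoidAnimation : MonoBehaviour
{
    [SerializeField] private RuntimeAnimatorController sharedController;
    [SerializeField] private Animator[] characters;

    private void Start()
    {
        foreach (Animator animator in characters)
        {
            // isHuman is only true when the model has a valid Humanoid Avatar,
            // which is what lets the retargeted mocap clips line up on
            // different player models.
            if (!animator.isHuman)
            {
                Debug.LogWarning($"{animator.name} is not set up as a Humanoid rig.");
                continue;
            }

            animator.runtimeAnimatorController = sharedController;
        }
    }
}
```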
With that hurdle solved, there were two other challenges to tackle: testing whether we could record our own mocap for future animations like dunks, crossovers, and other motions that aren't in the pack, and creating a basic animation system to make sure all the transitions and gameplay worked so we could iterate on it later.
Researching how to use motion capture was an interesting and fun experience. It made me understand how much range of motion can be captured with an advanced VR headset. The following was recorded using an HTC Vive Pro and Mocap Fusion [VR].
The recorded bones were configured and compatible with the Avatar we were already using. The final part was getting a basic animation system working for the prototype. Using the animation packs we had, I started mapping out states to determine when each animation is active, how it gets triggered, and how long it takes to transition into the next one. One thing I can improve on in the future is experimenting with transition settings like 'Exit Time', which controls when one state is allowed to start blending into the next, to make the transitions quicker.
Prototype 1's current Animation State Machine
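As a rough illustration of how a state machine like this gets driven from gameplay, here is a minimal sketch of a controller script that sets Animator parameters from input. The parameter and state names (MoveSpeed, Shoot, Dunk) and the input bindings are stand-ins for this example, not the exact setup used in the prototype.

```csharp
using UnityEngine;

// Minimal sketch of driving an Animator state machine from player input.
// Names and bindings below are placeholders; input uses the legacy Input
// class purely for brevity.
[RequireComponent(typeof(Animator))]
public class PlayerAnimationDriver : MonoBehaviour
{
    private Animator animator;

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    private void Update()
    {
        // Continuous parameters feed blend-style transitions (idle <-> dribble).
        float moveSpeed = Mathf.Abs(Input.GetAxis("Horizontal"))
                        + Mathf.Abs(Input.GetAxis("Vertical"));
        animator.SetFloat("MoveSpeed", moveSpeed);

        // One-shot actions fire triggers; the transitions defined in the
        // state machine decide how long the blend into that state takes.
        if (Input.GetButtonDown("Fire1"))
        {
            animator.SetTrigger("Shoot");
        }

        // CrossFade blends straight into a named state over a fixed duration,
        // bypassing the transition graph, which is one way to make an action
        // feel more responsive.
        if (Input.GetButtonDown("Jump"))
        {
            animator.CrossFade("Dunk", 0.1f);
        }
    }
}
```

On the responsiveness point above, the relevant settings live on the transitions themselves: disabling 'Has Exit Time' lets a trigger interrupt the current state right away, while a set Exit Time forces the current animation to reach a certain point before the blend can begin.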
This last video shows the result of the state machine. Overall, getting this animation system working within a week to be ready for the first prototype release was a hassle, but I'm satisfied with the first iteration. I also learned that experimenting with new ideas and looking for better ways to implement them is a fun challenge. In the future, I plan on fine-tuning the system to make it more responsive when the player uses their button commands.