The People Split is a 7-minute-long animation made for the University of Utah's machinima class. I worked on a team of 7 (3 engineers, 3 artists, and 1 producer).
The idea was inspired by my love for bananas and an old 3D model of a cartoon character my sister and I used for small animation projects in high school. Though I worked on all parts of the development process, the unique skills I brought included physics simulation and creating character morph targets (sometimes called blend shapes or shape keys).
Cloth simulations were baked in Blender and exported as Alembic caches for use in Unreal. This gave us very high-quality simulation but made for a slow process. The final animations were assembled in Unreal's Sequencer, so to check whether a cloth simulation lined up with character movement, I had to import the Alembic, run the sequence, and then go back into Blender to tweak the simulation. Making the Blender scene match Unreal's Sequencer, or performing as much of the animation as possible in Blender before exporting to Unreal, would have made iteration much faster and given more accurate results, but it also would have carried a high upfront cost for only two scenes with cloth simulation.
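For anyone curious what the Blender side of that bake-and-export step looks like, here is a minimal Blender Python (bpy) sketch. The frame range and output path are placeholders, and the exact operator arguments can vary slightly between Blender versions.

```python
import bpy

# Match the shot's frame range before baking so the cache covers the
# whole performance (frame numbers here are placeholders).
scene = bpy.context.scene
scene.frame_start = 1
scene.frame_end = 240

# Bake every point cache in the scene, which includes the cloth simulation.
bpy.ops.ptcache.bake_all(bake=True)

# Export the baked result as an Alembic cache that Unreal can import
# and play back in Sequencer.
bpy.ops.wm.alembic_export(
    filepath="//exports/cloth_shot_01.abc",  # hypothetical output path
    start=scene.frame_start,
    end=scene.frame_end,
    selected=False,  # export the whole scene rather than a selection
    flatten=True,    # drop the object hierarchy for a simpler import
)
```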
Early on, I knew it would be important to bake as many physics simulations as possible. Though it's possible to render a sequence in Unreal while running physics live, unpredictable simulations could make it hard to keep other aspects of our animations (e.g. camera framing) aligned. Thus, I used Unreal's Take Recorder to bake ragdoll simulations. This did cause some problems late in development, when it came to light that facial motion capture data could not be applied to a Take Recorder ragdoll recording, and that some simulations would need to be re-baked if they were to use facial mocap. For the most part, though, baked ragdoll simulations worked well.
I used a Cable Component to simulate the rope that our main character uses to descend the side of the Ban-Ana Corp tower, since, by the time we knew we needed a rope, external simulation was no longer an option. Unfortunately, as far as I could tell, there is no way to bake a Cable Component's simulation in Unreal, so this was the only physics simulation still running live at render time. However, I used invisible colliders to constrain the cable's movement as much as possible.
For the windows that shatter when our main character enters and exits the Ban-Ana Corp tower toward the end of the short, I used geometry collections. Geometry collection simulation is not captured by the Take Recorder, but it can be baked using a Chaos Cache Manager. To get the breaking glass to match the ragdoll movement, I simulated the ragdoll and the geometry collection together and had the Chaos Cache Manager and the Take Recorder record in parallel.
Having expressive, exaggerated facial animations is essential for conveying the silliness of cartoon characters. Prior to The People Split I had some experience with morph targets from experimenting in Blender. For this project, I researched the morph target values sent by Live Link - a free data streaming tool that brings the results of Apple ARKit's facial motion tracking into Unreal - and created matching morph targets for our cartoon character. I then modified our existing actor blueprint and created an Animation Blueprint that animates our character with prerecorded facial mocap data.
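As a rough illustration of the matching step, the bpy sketch below adds empty shape keys named after a handful of the ARKit blendshapes. The object name "Character" and this subset of names are placeholders, the casing has to match whatever curve names Live Link actually sends, and the deformation for each key still has to be sculpted by hand.

```python
import bpy

# A few of the ARKit facial blendshapes streamed by Live Link; the full
# set has 52 entries.
ARKIT_SHAPES = [
    "jawOpen",
    "eyeBlinkLeft",
    "eyeBlinkRight",
    "browDownLeft",
    "browDownRight",
    "mouthSmileLeft",
    "mouthSmileRight",
]

# "Character" is a placeholder for our cartoon character's mesh object.
obj = bpy.data.objects["Character"]

# Ensure a Basis key exists, then add one empty shape key per blendshape.
if obj.data.shape_keys is None:
    obj.shape_key_add(name="Basis")

for name in ARKIT_SHAPES:
    if name not in obj.data.shape_keys.key_blocks:
        obj.shape_key_add(name=name, from_mix=False)
```

On import, Unreal turns these shape keys into morph targets, which the Animation Blueprint can then drive with the incoming facial mocap curves.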
Having characters with facial animation elevated the quality of the project, though there is room for improvement.
One unique problem with our character is a redundancy in how the tops of the eyes are used: they act as both the character's eyelids and the character's brows. So when our facial mocap actor squints and frowns, the tops of the eyes can clip through the bottoms of the eyes.
Additionally, there are areas where the character could be more expressive. Our character's jaw movement is barely noticeable at some points, so increasing the influence of the jaw morph target, or taking the square root of its value (since the value lies between 0 and 1, the square root boosts mid-range values while keeping the result capped at 1), would have made the jaw read more clearly.
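Here is a minimal sketch of that remapping, written in plain Python for illustration; in the project this logic would belong in the Animation Blueprint, and the function name and exponent parameter are hypothetical.

```python
import math

def boost_morph_target(value: float, exponent: float = 0.5) -> float:
    """Remap a morph target weight in [0, 1] to exaggerate mid-range values.

    With exponent 0.5 (a square root), 0.25 becomes 0.5 and 0.5 becomes
    roughly 0.71, while 0 and 1 stay fixed.
    """
    clamped = min(max(value, 0.0), 1.0)
    return clamped ** exponent

# Example: an incoming jawOpen weight from facial mocap
print(boost_morph_target(0.3))  # ~0.55, a more visible jaw drop
```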