KNB227 CGI: Technologies
By Andrew McLellan
Concept
Environment
The first thing I did was think of an environment for my music video. I decided on a forest clearing, as I want the texture on my MetaHuman to be leaves of some kind. So I had a look around for some inspiration.
I found these images that felt close to what I had envisioned in my head.
I then drew a rough sketch of how I would lay out the environment to get a better idea of which specific assets I would need from Quixel and to finalize the look.
(Please ignore my terrible 2D drawing skills.)
MetaHuman
The inspiration for my MetaHuman design came from Doric the Druid in the D&D movie.
I wanted a Fae/nature-inspired look that would complement the texture changes I have planned.
MetaHuman is pretty limited when it comes to hair/clothing styles, so I couldn't get the character exactly how I wanted, but I think it gives off a similar energy.
Abstracted Look
As mentioned above, the initial concept for the abstracted textures was to do something with leaves.
I plan to have the character appear normal at the start of the video clip and then after she touches a flower or a falling leaf, leaves begin sprouting all over her body eventually covering her completely.
The leaves will move around as her body moves and some will fall to the ground as well.
Above are some sketches of how the transition will progress over the body.
I'll start by using Niagara particles to create the leafy texture and the falling leaves. Then I'll need some material masking techniques, built with a material network, to pull off the transition, which is something I haven't done in Unreal before.
I'm hoping I can pull it off because I think it will be a really cool effect.
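Before I build the material network, here's a minimal sketch of the masking logic in Python, just to get the idea straight in my head. This isn't Unreal code; the noise value, progress parameter, and softness are all my own assumptions about how the reveal might be driven.

```python
# Minimal sketch of the leaf-reveal mask I plan to build as a material
# network. Assumptions: a scalar "progress" parameter animated from 0
# (normal skin) to 1 (fully covered), and a per-pixel noise value that
# decides the order in which leaves sprout.

def leaf_mask(noise_value: float, progress: float, softness: float = 0.1) -> float:
    """Return leaf coverage (0..1) for one pixel."""
    # Smoothstep-style band around a moving threshold: pixels whose
    # noise is below 'progress' have already turned to leaves.
    edge = (progress - noise_value) / softness + 0.5
    return max(0.0, min(1.0, edge))

# Halfway through the transition, low-noise pixels are leaves and
# high-noise pixels are still skin; the material would lerp between
# the two looks using this value.
print(leaf_mask(0.2, 0.5))  # 1.0 -> fully leaves
print(leaf_mask(0.8, 0.5))  # 0.0 -> still skin
```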
Capture
For this test of the production pipeline, I'm using the data captured during the Tuesday, week 3 labs.
Retarget
To begin the retargeting process I first imported my custom MetaHuman into a blank Unreal scene so that I could export the skeletal meshes for use in MotionBuilder.
Here are the 5 meshes I exported.
After importing the meshes into MotionBuilder, I hid the unnecessary joints and then went through and re-mapped the main skeleton so that it was ready for retargeting.
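This step can also be scripted with MotionBuilder's Python SDK, which would save some clicking if I ever have to redo the mapping. A rough sketch, assuming placeholder joint names; the real MetaHuman skeleton uses its own naming.

```python
# Rough sketch of the characterization step in pyfbsdk. The joint
# names in the mapping below are placeholders, not the actual
# MetaHuman skeleton names.
from pyfbsdk import FBCharacter, FBFindModelByLabelName

mapping = {
    "HipsLink": "Skeleton:pelvis",       # placeholder joint names
    "SpineLink": "Skeleton:spine_01",
    "LeftUpLegLink": "Skeleton:thigh_l",
    # ...one entry per slot in the character definition
}

char = FBCharacter("MetaHumanCharacter")
for slot, joint_name in mapping.items():
    model = FBFindModelByLabelName(joint_name)
    if model:
        # Link the scene joint to the character definition slot.
        char.PropertyList.Find(slot).append(model)

# Characterize as a biped so the rig is ready for retargeting.
char.SetCharacterizeOn(True)
```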
The last step before retargeting was to clean up my folders: I moved all the skeleton meshes into the body folder and deleted the empty ones.
With the character prepared, I imported some motion data we recorded in week 2 and located my own take.
Once I turned my data into a character, I had an issue where some joints showed up red.
The spine and neck were mapped to the wrong joints: spine 1, 2 & 3 were mapped to spine 4, 5 & 6. So I moved them down, which I hoped was the correct move.
Everything turned green after that, so I think it was the right way to do it.
I then zeroed out the joints on the data and the mesh so they both T-posed. After I lined them both up, I scaled the mesh to match the data and then connected them.
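For reference, the zeroing can be done in one go with a small pyfbsdk script rather than joint by joint. A sketch, assuming that zeroing local rotations is what puts this particular skeleton into its T-pose:

```python
# Sketch: zero the local rotations on every skeleton joint in the
# scene. Assumption: zeroed rotations equal the T-pose for this
# skeleton, which won't be true for every rig.
from pyfbsdk import FBSystem, FBModelSkeleton, FBVector3d

def zero_rotations(model):
    if isinstance(model, FBModelSkeleton):
        model.Rotation = FBVector3d(0, 0, 0)
    for child in model.Children:
        zero_rotations(child)

for root in FBSystem().Scene.RootModel.Children:
    zero_rotations(root)
```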
The connection seemed pretty good. I scrubbed through the take and there were no hyperextensions or severe mesh collapses.
There was one artifact, however: some of the face mesh was pulled towards (0, 0, 0), and I have no idea why. Maybe it has something to do with the hidden face rig? I plan to ask Paul during the lab if he knows why this could be happening, or hopefully there's an example in the next video.
Edit: Paul has informed me that this is a known issue with MetaHumans and MotionBuilder, and to ignore it for now.
There was one more problem I ran into. After saving the mapped T-pose (keeping only take 001), then opening a new scene and merging the mocap data back in, I got this error:
I don't really understand this error. I get that it's trying to match to models that aren't there, but as for how to fix it, I have no idea.
If I push the merge through, it creates clones of the data already in the scene, and when I scrub through the take it no longer animates the mesh.
The only thing I can think of that may be causing this is perhaps me moving the data around in the scene earlier when I was T-posing it; maybe it doesn't line up when I try to merge it again. At this stage, I'm not too sure how to fix it or whether it will cause more issues later.
I'll find out soon.
OK, so after discussing things with Paul, it turns out that because the movement data contained multiple actors, the naming was incremented on all of the skeletons. When I tried merging new data into the scene, it followed the incrementation and created a new skeleton rather than combining it with the one I had already mapped.
After a lot of messing around trying to solve this without having to redo everything, we managed to get it working by re-exporting the movement data from Shogun with only one actor and then re-zeroing it in MotionBuilder.
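In hindsight, a quick script could have confirmed what was going on before we resorted to re-exporting. This is only a hypothetical diagnostic: it lists skeleton nodes whose names carry a numeric suffix, which is what the incrementation produces when merged data can't match the existing models.

```python
# Hypothetical diagnostic for the incremented-name problem: list any
# skeleton nodes whose names end in a numeric suffix (e.g. "Hips 1"),
# the telltale sign that merged data created clones instead of
# matching the skeleton already in the scene.
import re
from pyfbsdk import FBSystem, FBModelSkeleton

def find_incremented(model, hits):
    if isinstance(model, FBModelSkeleton) and re.search(r"\s\d+$", model.Name):
        hits.append(model.LongName)
    for child in model.Children:
        find_incremented(child, hits)

hits = []
for root in FBSystem().Scene.RootModel.Children:
    find_incremented(root, hits)
print("\n".join(hits) if hits else "No incremented skeleton names found.")
```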
Export
Now that I have it all working as intended, I cleaned up the MotionBuilder scene to prepare for Unreal. I first plotted the data onto my MetaHuman and deleted the Shogun skeleton. Then I exported my MetaHuman as an FBX.
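The plot-and-export step can be scripted too. A sketch in pyfbsdk, assuming the MetaHuman is the current character in the scene; the output path and plot options are placeholders I'd start from, not what the pipeline requires.

```python
# Sketch of plotting the retargeted take onto the MetaHuman skeleton
# and exporting it as FBX. Path and options are placeholders.
from pyfbsdk import (FBApplication, FBCharacterPlotWhere, FBPlotOptions,
                     FBTime)

app = FBApplication()
char = app.CurrentCharacter           # the characterized MetaHuman

opts = FBPlotOptions()
opts.UseConstantKeyReducer = False    # keep every frame of the mocap
opts.PlotPeriod = FBTime(0, 0, 0, 1)  # one key per frame

# Bake the retargeted motion onto the MetaHuman's own skeleton.
char.PlotAnimation(FBCharacterPlotWhere.kFBCharacterPlotOnSkeleton, opts)

# Export the baked take (placeholder path).
app.FileExport("C:/mocap/metahuman_take001.fbx")
```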
Unreal Setup
The first thing I did in Unreal was make a basic forest environment to show my concept to my group members.
I started with some trees and stones from Quixel.
I then added foliage to give the forest some life.
Lastly, I played around with volumetric fog and lighting a little bit to shift the feel of the environment towards a misty forest.
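The same fog setup can be reproduced from the editor's Python console. A sketch, assuming placeholder values; I tuned the real scene by eye in the viewport rather than by script.

```python
# Sketch: spawn an Exponential Height Fog actor and enable volumetric
# fog via Unreal's editor Python API. The density value is a
# placeholder, not my final setting.
import unreal

fog_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.ExponentialHeightFog, unreal.Vector(0, 0, 0))
fog = fog_actor.get_component_by_class(unreal.ExponentialHeightFogComponent)

fog.set_editor_property("volumetric_fog", True)  # turn on volumetric fog
fog.set_editor_property("fog_density", 0.05)     # thicker, mistier air
```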
This scene isn't final, merely a proof of concept to show my group. If they want to go with this environment, I will add a lot more detail and tweak the lighting.
Now that my scene was ready for my animation data, I attempted to import the FBX I exported from MotionBuilder into Unreal.
I was hit with this error. I looked online for a solution, and luckily for me it's a pretty common problem. To solve it, I just needed to change the 'animation length' setting on my import from 'Exported Time' to 'Animated Time'.
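For repeat imports, the same fix can be baked into a scripted import. A sketch with Unreal's Python API; the file path, destination, and skeleton asset are placeholders, but the 'Animated Time' setting is the one that fixed the error.

```python
# Sketch of a scripted FBX animation import with the fix applied:
# animation length set to "Animated Time". Paths and the skeleton
# asset reference are placeholders.
import unreal

options = unreal.FbxImportUI()
options.import_mesh = False
options.import_animations = True
options.skeleton = unreal.load_asset("/Game/Placeholder/MetaHuman_Skeleton")
options.anim_sequence_import_data.set_editor_property(
    "animation_length",
    unreal.FBXAnimationLengthImportType.FBXALIT_ANIMATED_KEY)  # "Animated Time"

task = unreal.AssetImportTask()
task.filename = "C:/mocap/metahuman_take001.fbx"  # placeholder
task.destination_path = "/Game/Mocap"
task.options = options
task.automated = True

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```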
Once this was done, I used the sequencer to attach the animation to my Unreal MetaHuman by deleting the default rig and dropping the FBX onto it.
I then re-added the default rig so that the animation was plotted onto the MetaHuman like in MotionBuilder.
I did want to add some facial animation next and tried to record some on my phone with Live Link; however, my iPhone is too old to support the technology :(
Adjustment
The last step was to go through the animation take and adjust the rig where needed to avoid clipping and hyperextensions.
After scrubbing through the animation, there was only one point that needed fixing, where the forearms were twisted upside down. I used additive keyframe animation to tweak the problem joints so that they looked more natural. After that, the animation looked great, minus the shoulders being a bit too wide, since the captured data was from a male performer.