Making a character that the player connects with can be done in different ways, but what I’ve found is that the details are just as important as the big pieces.
The details are there to show context, awareness and anticipation.
A system that lets characters look at things in the world makes a character seem aware of the environment they are in.
Head tracking systems work by adding rotations to the character’s neck and head based on the difference between the currently animated rotation and the rotation towards their target.
For better quality you also want to spread the rotation over the spine, with increasing strength the further up the chain you go.
The next step I’ve used after that is to add eye tracking for the last 15 to 20 percent of the rotation. Humans like to spend as little effort as possible, so they don’t track with the head when the eyes can do the job.
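As a rough illustration, here is a minimal sketch of that distribution, assuming Unreal’s math types; the bone names and weights are made up for the example, and a real setup would live inside an animation node with proper space conversions.

```cpp
#include "CoreMinimal.h"

// Sketch: distribute a look-at rotation over the spine, neck and head,
// leaving the last portion to the eyes. Bone names and weights are
// illustrative, not taken from any specific project.
struct FLookAtBone
{
    FName BoneName;
    float Weight; // this bone's share of the total delta rotation
};

// Increasing strength up the chain; the remaining ~15% goes to the eye joints.
static const FLookAtBone LookAtChain[] =
{
    { FName(TEXT("spine_02")), 0.10f },
    { FName(TEXT("spine_03")), 0.15f },
    { FName(TEXT("neck_01")),  0.25f },
    { FName(TEXT("head")),     0.35f },
};

// Delta between the currently animated head forward and the direction to the target.
FQuat ComputeLookAtDelta(const FVector& HeadLocation, const FVector& HeadForward, const FVector& TargetLocation)
{
    const FVector ToTarget = (TargetLocation - HeadLocation).GetSafeNormal();
    return FQuat::FindBetweenNormals(HeadForward.GetSafeNormal(), ToTarget);
}

// Per-bone additive rotation: a slerp from identity by the bone's share of the delta.
FQuat GetBoneAdditiveRotation(const FQuat& FullDelta, float BoneWeight)
{
    return FQuat::Slerp(FQuat::Identity, FullDelta, BoneWeight);
}
```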
In Unreal this can be implemented via a customized Spline IK node that handles the spine, followed by a custom rotation solve for the head and neck. Solving the eyes depends on how they have been set up on the mesh, but generally I’ve found that you want to drive the eyes with dedicated joints, as they are more precise than blend shapes.
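For the eye joints, the solve mostly comes down to converting the target into a yaw and pitch for each eye and clamping it to a comfortable range. A minimal sketch, again assuming Unreal’s math types, with placeholder limits and an assumed +X forward axis on the eye bone:

```cpp
#include "CoreMinimal.h"

// Sketch: aim one eye joint at a world-space target and clamp the result.
// Assumes the eye bone's local forward axis is +X; limits are placeholder values.
FRotator SolveEyeRotation(const FTransform& EyeBoneWorld, const FVector& TargetWorld)
{
    // Bring the target into the eye bone's local space so yaw/pitch are easy to read off.
    const FVector LocalDir = EyeBoneWorld.InverseTransformPosition(TargetWorld).GetSafeNormal();

    float Yaw   = FMath::RadiansToDegrees(FMath::Atan2(LocalDir.Y, LocalDir.X));
    float Pitch = FMath::RadiansToDegrees(FMath::Asin(LocalDir.Z));

    // Human-ish eye limits; real values depend on the rig.
    Yaw   = FMath::Clamp(Yaw,   -40.f, 40.f);
    Pitch = FMath::Clamp(Pitch, -30.f, 25.f);

    return FRotator(Pitch, Yaw, 0.f);
}
```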
One thing that I like to add is making the character aware of the camera direction. The camera represents where the player is looking and, many times, where they are going to go next, and connecting the character with that means the character anticipates where the player is heading.
Using this during normal locomotion and while standing still is the easiest use case and gives you immediate results. If the player is looking around in awe at the view, the character will do the same.
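A simple way to get this is to project a look target out along the camera’s forward vector and feed it to the same head tracking system; a small sketch with assumed distances and limits:

```cpp
#include "CoreMinimal.h"

// Sketch: build a head tracking target from the camera direction.
// The distance and the angle cutoff are placeholder values.
FVector MakeCameraLookTarget(const FVector& HeadLocation,
                             const FVector& CameraForward,
                             const FVector& CharacterForward)
{
    const float LookDistance = 500.f; // how far out in front to place the target

    // Ignore the camera when it points too far behind the character,
    // so the head does not try to turn past its limits.
    const float CosAngle = FVector::DotProduct(CameraForward.GetSafeNormal(),
                                               CharacterForward.GetSafeNormal());
    const FVector Direction = (CosAngle > -0.2f) ? CameraForward : CharacterForward;

    return HeadLocation + Direction.GetSafeNormal() * LookDistance;
}
```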
Another nice detail you can add is predicting slopes with the head, so that when moving uphill the head also looks up.
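One way to do that is to sample the ground a bit ahead of the character and raise or lower the look target by the height difference; a hedged sketch where the ground trace itself is left out and the sample distance is made up:

```cpp
#include "CoreMinimal.h"

// Sketch: pitch the look target based on the ground height a bit ahead.
// GroundHeightAhead would normally come from a trace against the walkable surface.
FVector MakeSlopeAwareLookTarget(const FVector& HeadLocation,
                                 const FVector& MoveDirection,
                                 float GroundHeightHere,
                                 float GroundHeightAhead)
{
    const float SampleDistance = 300.f;                    // how far ahead we sampled
    const float HeightDelta    = GroundHeightAhead - GroundHeightHere;

    FVector Target = HeadLocation + MoveDirection.GetSafeNormal() * SampleDistance;
    Target.Z += HeightDelta;                               // going uphill raises the target
    return Target;
}
```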
In one game I also implemented the head tracking during climbing, so the character would look in the direction of the climb. Climbing controls are complicated as they need to be aware of the camera direction as well as the plane of movement. With the head tracking connected to that climbing direction you can show the player where they would go when pushing the stick. I am personally a fan of systems that help the player in subtle ways.
There is also no need to stop the head tracking during jumps and other traversal, but you may need to constrain it. If the head keeps tracking during a jump it shows anticipation of where the player is trying to reach.
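Constraining it can be as simple as a per-state clamp on how far the head is allowed to turn away from the body; a sketch with made-up limits:

```cpp
#include "CoreMinimal.h"

enum class ETrackingState { Grounded, Jumping, Climbing };

// Sketch: clamp the head-tracking yaw/pitch differently per movement state.
// The limits are placeholder values.
FRotator ClampHeadTracking(const FRotator& DesiredOffset, ETrackingState State)
{
    float MaxYaw = 70.f, MaxPitch = 45.f;          // relaxed limits on the ground
    if (State == ETrackingState::Jumping)  { MaxYaw = 35.f; MaxPitch = 30.f; }
    if (State == ETrackingState::Climbing) { MaxYaw = 50.f; MaxPitch = 60.f; }

    return FRotator(FMath::Clamp(DesiredOffset.Pitch, -MaxPitch, MaxPitch),
                    FMath::Clamp(DesiredOffset.Yaw,   -MaxYaw,   MaxYaw),
                    0.f);
}
```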
For NPCs the head tracking can be used to track their target and show who they are focused on and fighting. It can also be used during locomotion while they are circling their opponent.
In one game I implemented very basic head tracking for pedestrians, where they would sometimes glance at other people as they passed each other.
During locomotion the NPCs can also look slightly ahead on their movement path, as that gives the character anticipation of their goal. I remember a bug we had where the target was the full 3D position of the path point, so whenever the NPCs were getting closer to the goal they would look down at the ground.
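An easy way to avoid that kind of bug is to keep the look-ahead target at eye height rather than using the raw path point; a short sketch with hypothetical names:

```cpp
#include "CoreMinimal.h"

// Sketch: build an NPC look target a bit ahead on the path, kept at eye height
// so the character does not stare at the ground when close to the goal.
FVector MakePathLookTarget(const FVector& EyeLocation,
                           const FVector& PathPointAhead)
{
    const float MinLookDistance = 200.f;     // placeholder

    FVector Flat = PathPointAhead;
    Flat.Z = EyeLocation.Z;                  // keep the target at eye height

    // If the goal is very close, push the target out along the same direction
    // so the look direction stays stable.
    const FVector ToTarget = Flat - EyeLocation;
    if (ToTarget.Size() < MinLookDistance)
    {
        Flat = EyeLocation + ToTarget.GetSafeNormal() * MinLookDistance;
    }
    return Flat;
}
```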
Head tracking on an NPC can also be used in cover-based shooters, where you have them peek out of cover without shooting. This is a hint to the player of where they are aiming.
I’ve also previously mentioned using head tracking during conversations, but in that situation you want to tie the tracking in with the full-body animation, so you need to be able to blend it in and out during animations.
A detail that BioWare was using that was new to me was what they called a glance system. This is a system on top of the tracking where the character is allowed to briefly break the tracking to look at something more important.
What counts as important is decided by giving targets different priorities so that characters can swap between them. The priority system needs to be there for normal tracking anyway; if, for example, there is an explosion, a very high priority tracking target activates.
The glance target was there to let a character take a quick look at a new character and then go back to tracking. This detail removes the robotic feeling of the tracking, because you can break it up.
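A minimal version of that priority-plus-glance selection might look like the sketch below; the structure and timings are my own assumptions, not BioWare’s actual implementation.

```cpp
#include "CoreMinimal.h"

// Sketch: pick a look-at target by priority, with an optional short-lived
// glance that temporarily overrides the normal target. Names and timings
// are illustrative only.
struct FLookTarget
{
    FVector Location = FVector::ZeroVector;
    int32   Priority = 0;          // higher wins (e.g. explosion >> idle point of interest)
};

struct FGlanceState
{
    FLookTarget Target;
    float       TimeLeft = 0.f;    // glances only last a moment
};

FLookTarget SelectLookTarget(const TArray<FLookTarget>& Targets,
                             FGlanceState& Glance,
                             float DeltaTime)
{
    // An active glance wins until its short timer runs out.
    if (Glance.TimeLeft > 0.f)
    {
        Glance.TimeLeft -= DeltaTime;
        return Glance.Target;
    }

    // Otherwise fall back to the highest-priority normal target.
    FLookTarget Best;
    for (const FLookTarget& Candidate : Targets)
    {
        if (Candidate.Priority > Best.Priority)
        {
            Best = Candidate;
        }
    }
    return Best;
}
```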
A lot of the detail that you see in third person games like Uncharted or The Last of Us comes from a lot of contextual animations: locomotion sets that are swapped with changes in character state. Hot, cold, hurt, carrying another character and so on.
There are not many shortcuts you can take there to achieve that level of fidelity. I’ve done tricks with additive animations that stay in sync with the locomotion animations to show a hurt state, or to add noise on top of the locomotion. This was implemented in a similar way to the different handling styles. With layering you can reduce the number of animations you need to create, but it will never be quite as good as dedicated full-body animations.
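As a rough illustration of that layering trick, the additive hurt clip can be sampled at the same normalized time as the base locomotion cycle so the two stay in phase, and blended in by how hurt the character is; a sketch with hypothetical parameters:

```cpp
#include "CoreMinimal.h"

// Sketch: keep an additive hurt layer in sync with the base locomotion cycle
// and weight it by how hurt the character is. The sampling calls are stand-ins
// for whatever the animation system provides.
struct FAdditiveHurtLayer
{
    float HurtAmount = 0.f;       // 0 = healthy, 1 = badly hurt

    // Both clips are authored to the same cycle, so the additive is sampled
    // at the locomotion's normalized time to stay in phase with the footsteps.
    float GetAdditiveSampleTime(float LocomotionNormalizedTime, float AdditiveClipLength) const
    {
        return LocomotionNormalizedTime * AdditiveClipLength;
    }

    float GetAdditiveWeight() const
    {
        return FMath::Clamp(HurtAmount, 0.f, 1.f);
    }
};
```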