Modular Procedural Rigging

Introduction

More pipeline ideas from David Hunt.

Utility Scripts --> Rig Component Scripts --> Rig Template Scripts

Create a document explaining what each script does.

Elemental Structure (Morgan Loomis)

Utility Scripts

Common, general-purpose functions: scene traversal, namespaces, strings, import/export, etc.

Rig Component Scripts

FK/IK chain, Stretchy Spline IK, Multi-constraint, Procedural Fixup

(what are the last two?)

So, how can metadata help with this?

Example--create a "Simple IK Rig Component"

On the metaNode, you have attributes for:

Version

MetaType (Rig Ik Chain, in this case)

MetaChildren (perhaps if it's an arm chain, you'd have a metaHand?)

Do Not Touch (??)

Start Joint

End Joint

Ik Grip (control?)

Pv Grip (control?)

Ik Joints -- all the joints in the ik chain
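The attribute list above can be sketched as plain data. This is a pure-Python stand-in for a Maya network node (no `maya.cmds` here); the attribute names and the `_ikCtrl`/`_pvCtrl` naming scheme are illustrative assumptions, not a documented convention.

```python
def make_ik_meta(start_joint, end_joint, ik_joints):
    """Build a metadata record for a 'Simple IK Rig Component' metaNode.

    In Maya these would be attributes and message connections on a
    network node; here it's just a dict for illustration.
    """
    return {
        "version": 1,
        "metaType": "rigIkChain",
        "metaChildren": [],                  # e.g. a metaHand node on an arm chain
        "doNotTouch": True,                  # flag: managed by tools, not hand-edited
        "startJoint": start_joint,
        "endJoint": end_joint,
        "ikGrip": start_joint + "_ikCtrl",   # assumed control-naming scheme
        "pvGrip": start_joint + "_pvCtrl",
        "ikJoints": list(ik_joints),         # all the joints in the ik chain
    }

meta = make_ik_meta("L_shoulder", "L_wrist",
                    ["L_shoulder_ik", "L_elbow_ik", "L_wrist_ik"])
```

A tool can then find the IK control for any chain by reading `meta["ikGrip"]` instead of guessing at node names.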

This enables you to have 'math-node driven switch mechanism between FK/IK'

FK/IK switch attr instanced onto all rig controls

Switch/Align enabled by semantic traversal (what does this mean?)

Right click menu

Example: FK/IK Rig Component

Your metanode has attributes for:

Version

Meta Type (rigFKIKChain)

Meta Children

Ctrls (list of controls?)

Do Not Touch (?)

Start Joint

End Joint

Ik Grip

Pv Grip

Ik Handle

Switch Node

Switch Attr (fkik_switch)

Fk Joints -- all your fk joints

Ik Joints -- all your ik joints
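The "math-node driven switch mechanism" mentioned above is, per channel, just a linear blend between the FK and IK joint values (in Maya, typically a blendColors or pairBlend node driven by the `fkik_switch` attribute). A minimal sketch of that math, outside Maya:

```python
def blend(fk, ik, switch):
    """Per-channel linear blend: switch 0.0 = full FK, 1.0 = full IK.

    Emulates what a blendColors/pairBlend node network computes when
    driven by an fkik_switch attribute.
    """
    return tuple(f + (i - f) * switch for f, i in zip(fk, ik))

fk_rot = (90.0, 0.0, 0.0)   # rotation from the FK joint chain
ik_rot = (45.0, 10.0, 0.0)  # rotation from the IK joint chain

assert blend(fk_rot, ik_rot, 0.0) == fk_rot
assert blend(fk_rot, ik_rot, 1.0) == ik_rot
assert blend(fk_rot, ik_rot, 0.5) == (67.5, 5.0, 0.0)
```

Because the metaNode stores `Switch Node` and `Switch Attr`, a tool can instance this one attribute onto every control and always find the node network behind it.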

Rig Template Scripts

One or more per character or object. Consistent, and customizable.

Main Rigging Pipeline (ie., Xrig)

This would be your main 'autorig' database to build different metadata modules

Alternative to File Referencing (how?)

Distributes updates to rigs/components through MetaNodes (how?)

Database back end (what does that mean?)

Some Autorig Theory (An open source autorig?)

(Kiaran Ritchie)

Apologies for not having time to read all the messages (I skimmed through), but I wanted to throw in my two cents. For those who don't know me, I was the author of The Art of Rigging series, and since the publication of those books I've spent a lot of time thinking about modular rigging systems.

A few points:

-Python only, no MEL for reasons already discussed.

-There should be a collection of pre-requisite plugins to accompany the Python scripts. In order to really get serious about this, we're going to have to extend Maya with new node types through C++ (not scripted plug-ins).

-The granularity of the 'modules' is a HUGE design decision. Making a 'biped' module is WAY too big. I would opt for atomic modules with generic functionality. That way, they can be aggregated into larger modules. For example, an "IKHand" module would have 5 children modules (1 for each finger). Then the IKHand has a clearly defined interface to plug into a limb (like an arm, tentacle, whatever).

-Use a message connection system to create explicit connections between the components of the rig. That way, a component can truly own all its nodes by linking with them directly. And I mean ALL its nodes (every single DG node associated with a module is connected to the module parent node).

I have already written a VERY generic Python message system inspired by the work of Justin Leach for Lucasfilm. It's a simple concept that makes a world of difference. I would be glad to share this.
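The ownership idea above (every DG node message-connected to its module's parent node) can be sketched as a two-way registry. In Maya the links would be message-attribute connections; here a plain-Python class stands in for them, and all names are illustrative.

```python
class ModuleRegistry:
    """Emulates message connections between a module node and the DG
    nodes it owns: every node links back to exactly one module."""

    def __init__(self):
        self._owner = {}    # node -> module (the message connection)
        self._members = {}  # module -> [nodes], in connection order

    def connect(self, module, node):
        """Link a node to its owning module (like connectAttr on a
        message plug would in Maya)."""
        self._owner[node] = module
        self._members.setdefault(module, []).append(node)

    def nodes_of(self, module):
        """Everything the module owns -- lets a tool delete or rebuild
        the whole component without name-based guessing."""
        return list(self._members.get(module, []))

    def module_of(self, node):
        """Walk from any node back to its owning module."""
        return self._owner.get(node)

reg = ModuleRegistry()
reg.connect("ikHandModule", "ikHandle1")
reg.connect("ikHandModule", "L_wrist_ikCtrl")
```

The payoff is that selection-based tools can resolve any picked node to its component, and a rebuild can tear down exactly the nodes a component created.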

WHOLE THREAD

http://forums.cgsociety.org/showthread.php?p=6369554#post6369554

(Olivier Renouard)

About the conventions:

Lots of interesting things have been said already. Too lazy to go and quote all over the place, so, as I remember:

eek said:

  • Ability to update and change the source, whilst content has already been made

  • Fast, simple semantic traversal of the hierarchy

  • Ability to mirror any part against any axis

  • Loading animation data on controls rather than the rig itself

  • etc

Agree totally here. One important thing I think is to clearly separate deformation rig and animation rig. Any input (animation, mocap) should go on the animation rig (controls) and deformation rig should be driven by the animation rig. Makes it easier to identify "ins" and "outs" and where animation should be exported / imported / baked eventually. In what you say do you mean animation rig is the source (or is it just animation), and deformation rig is content ?

Also conventions and definition of what a rig "part" is could probably be easier by differentiating between an animation rig "part" and deformation rig "part".

Talking only about the animation rig, from the back of my head we used generic guidelines for our rig "parts" like:

  • it should provide a building procedure taking all the required hooks (where to attach itself to) as explicit arguments (never imply somewhere in your code there is a "scapula" on that specific skeleton you're building the rig on, if you need a specific part to build, it should be an argument of the build procedure)

  • it should be able to provide (either as return values of the build procedure, through methods or attribute of the class) the list of all the animation controls it creates (thus where will the input be applied)

  • it should be able to provide a list of all its outputs, and a bake procedure / methods on these outputs (what does it affect, how to recover the result animation)
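The three guidelines above amount to an interface contract: explicit hooks in, controls and outputs reported back. A minimal sketch of such a base class, in pure Python with illustrative names (`RigPart`, `ArmPart`, and the control-naming scheme are all assumptions):

```python
from abc import ABC, abstractmethod

class RigPart(ABC):
    """Contract for a rig 'part': build() takes all hooks as explicit
    arguments and the part reports its controls and outputs afterward."""

    def __init__(self):
        self.controls = []   # where animation input will be applied
        self.outputs = []    # what the part affects (bake targets)

    @abstractmethod
    def build(self, **hooks):
        """Build the part onto the given hooks; return its controls."""

class ArmPart(RigPart):
    def build(self, scapula, shoulder, wrist):
        # Hooks are explicit arguments -- nothing here assumes a
        # particular skeleton has a "scapula" by name.
        self.controls = [shoulder + "_fkCtrl", wrist + "_ikCtrl"]
        self.outputs = [shoulder, wrist]
        return self.controls

arm = ArmPart()
ctrls = arm.build(scapula="L_scap", shoulder="L_shoulder", wrist="L_wrist")
```

A build engine can then treat every part uniformly: call `build()` with hooks, collect `controls` for animation import, and `outputs` for baking.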

We also had the added restriction (but that's more a matter of choice) that the rig had to preserve existing animation when applied, reapplying it on the inputs, either replacing it (blend) or applying offsets to it (add). Allowed our rigs to work for mocap editing as well as direct keyframe. And allowed multiple application of rig parts (rig, animated, bake, can apply a local rig "on the fly" to correct something and rebake). Note that now that Maya starts implementing animation layers there are probably better ways to do it than there used to be.

Mark said: The key to all of this isn't really the build part of the rig itself, it's the core tagging systems that describe everything.

Very true. Having to work with different packages that use different naming conventions (Maya and MotionBuilder mainly, but occasionally 3DS Max too) and different clients that have their own skeleton structure (we also used to produce animation for in-game cut scenes at Attitude Studio, so you have to adapt to any deformation skeleton thrown at you), the first thing we did was to define a tagging system to identify joints easily, and make sure all rigging tools used it exclusively. That means no rig tool assumed the naming convention or hierarchy would be this or that way; instead, all rigging procedures called the tag tools library whenever they requested a specific joint (ie "in this skeleton, give me the left humerus"). Tags were embedded on the Maya objects making up the characters (joints, specific points of interest, and even geometry) and queried through this common library. Alongside it were tools like a generic name parsing system, allowing you to define and support new object naming conventions without writing code (with regex-like definition files) and later refer to a specific convention by its name (ie "give me the name of the left humerus in 'studio X' naming convention"), and "matchers" to help create correspondence between skeletons based on position / hierarchy.

Note that this identification system was very much inspired by the H-Anim specification, a specialisation of the VRML 3D standard. It defines joints but also "hooks" and "points of interest" on the character outside the skeleton (to place a specific control, for instance). Also the notion of segments that they use, and its application to "segment matching", is very interesting for doing meaningful matching of skeletons with different hierarchies (ie match "elbow" with "elbow" whatever the number of roll bones used, not just with "first joint down the shoulder").
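The regex-definition-file idea above can be sketched in a few lines: conventions are data, and tools only ever ask for semantic tags. The two convention patterns below are invented for illustration, not Attitude Studio's actual files.

```python
import re

# Hypothetical naming conventions, defined as data rather than code.
# A real system would load these from definition files.
CONVENTIONS = {
    "studioX": re.compile(r"(?P<side>[LR])_(?P<bone>\w+)_jnt"),
    "studioY": re.compile(r"(?P<bone>\w+)_(?P<side>left|right)"),
}

def tag(name, convention):
    """Return a (side, bone) semantic tag for a joint name, or None
    if the name doesn't follow the given convention."""
    m = CONVENTIONS[convention].fullmatch(name)
    if m is None:
        return None
    side = {"L": "left", "R": "right"}.get(m.group("side"), m.group("side"))
    return (side, m.group("bone").lower())

# Two different studio names resolve to the same semantic tag,
# so rigging tools never hardcode either naming scheme.
assert tag("L_humerus_jnt", "studioX") == ("left", "humerus")
assert tag("Humerus_left", "studioY") == ("left", "humerus")
```

The inverse lookup ("give me the name of the left humerus in studio X convention") would walk the scene once, tag every joint, and index tags back to names.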

Also agreeing with Kiaran that modules shouldn't be too big. A rig brick should be the smallest part of a rig that can work independently (a finger rig; a leg, hip-to-ankle, rig). Bigger rig assets could consist of rig bricks and their relations. Actually, by allowing a rig brick to be a collection of other rig bricks and their relations (and so on recursively), it would be easier to define a "hand" rig from existing "finger" rigs without having to rewrite more than necessary.
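The recursive-brick idea is the composite pattern: a compound brick exposes the same `build` interface as an atomic one and delegates to its children. A small pure-Python sketch (all class and naming choices are illustrative):

```python
class FingerBrick:
    """Atomic brick: builds one finger's control off a hook."""
    def __init__(self, name):
        self.name = name

    def build(self, hook):
        return ["{}_{}_ctrl".format(hook, self.name)]

class HandBrick:
    """Compound brick: a collection of child bricks, built recursively.
    Exposes the same build(hook) interface as an atomic brick, so a
    HandBrick could itself be a child of an ArmBrick, and so on."""
    def __init__(self, children):
        self.children = children

    def build(self, hook):
        controls = []
        for child in self.children:
            controls += child.build(hook)
        return controls

hand = HandBrick([FingerBrick(n)
                  for n in ("thumb", "index", "middle", "ring", "pinky")])
```

Defining a "hand" rig then costs only the list of its finger bricks, with no finger code rewritten.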

About the class approach:

I'd think that once the generic conventions about how a rigging brick should behave (take arguments, build, and return results) are settled, it should be possible to define a very generic "rig part" class that implements these (and these only, so as not to be overly restrictive). Thinking mainly about the animation rig part here again, but by inheriting from that base class for your custom rig part class it would ensure that:

  1. you use the necessary conventions for it to be recognized by a common "rigging automaton" (the "build engine")

  2. most of the grunt work is done for you

About the UI:

I think your template idea is nice. In many cases you can generate UI dynamically from an "extended" procedure description, either by using external templates (like Prman's MTOR used SLIM for SL shaders, hopefully in a less bloated way), or even by enriching the source code with extra directives (like 3delight does using #pragma directives added in the shader SL code). It's quite easy to embed extra info in Python code, either in the docstring or through specific decorators. We're talking about two different kinds of UI there: the building UI (where you specify what to apply the rig to) and the animation UI (where you actually use the rig). They are different things and would probably need different approaches (I'd expect the specification for the animation UI to be more complex and to benefit more from an external template approach, whereas the building UI can be more easily generated from the arguments the rigging class / procedure takes).
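Generating the building UI "from the arguments the rigging procedure takes" is straightforward in Python via introspection. A sketch using the standard `inspect` module; `build_arm` and its parameters are hypothetical:

```python
import inspect

def build_arm(scapula: str, shoulder: str, twist_count: int = 2):
    """Hypothetical rig-building procedure; annotations and defaults
    double as the UI description."""
    return [scapula, shoulder]

def ui_fields(fn):
    """Derive one UI field per build argument from the signature alone:
    label from the name, widget type from the annotation, initial
    value from the default."""
    fields = []
    for name, p in inspect.signature(fn).parameters.items():
        ftype = (p.annotation.__name__
                 if p.annotation is not inspect.Parameter.empty else "str")
        default = None if p.default is inspect.Parameter.empty else p.default
        fields.append({"label": name, "type": ftype, "default": default})
    return fields

fields = ui_fields(build_arm)
```

Docstrings or decorators could then layer on the "extended" description (tooltips, widget hints) without touching the build logic.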

I had specced and worked on a modular / lego like rigging tool for Attitude Studio when I was head of setup / rig there. We once had to do a step by step / debug mode demo for some presentation that I still got on my site:

http://www.drone.org/demos/attitude...g/lang-pref/en/

It's sort of a toolbox / lego approach. The left side of the UI would list the available rig building bricks in our library. It's pretty old, since it was started in our department at the time of the "Renaissance" movie, but by the end of its use we had about 3-4 different "variants" available for each brick. New ones could be added just by using the same conventions when writing rigging procedures. Then the user would drag the parts he wanted to use (1, 2, 4 arms if needed, not necessarily using the same brick for all of them) from the left "toolbox" list to the right "used" list.

Even with MEL you could get a full humanoid animation rig auto-mounted in about 30 seconds with verbose / debug mode off, so I don't think Python would be too slow for building tools. C++ plugins were used for nodes that are actually part of the rig, sometimes in great numbers (like a constraint); their evaluation time is much more critical, as it directly affects rig performance and the framerate the animators get. We used a small number of C compiled nodes, actually no more than 10: hardcoded utility nodes (smoothstep, cosine, etc) to avoid using expressions, a better orient constraint that could cope with different ways of orienting skeletons and gave us a good quaternion-based aim/twist weighted decomposition for roll bones, a hinge plane constraint to re-express elbows and knees closer to single-axis rotations, then mostly custom deformers.

As a final note, there are possibly things in Pymel that could be a good base for such a project. I haven't contributed to it for some time due to other assignments, but from the days I used it, the object-oriented approach to Maya's own Python wrap makes for much cleaner code. The unified access to both script and API Python is nice too, as are ways to pass info to and from MEL and maybe reuse some old MEL code. There might be a useful name parsing library for Maya there, as well as new Matrix and Vector libs that allowed deriving both from Maya's MMatrix and a generic n-by-m matrix lib (where everything was done to allow an easy replacement with numpy should one want to use it for speed), meaning these objects would both inherit from the generic matrix lib AND be seen as regular MMatrix objects by Maya (and thus could be passed directly to Maya functions that expect these). Such wrapping doesn't give much of a speed hit and makes it easier to adapt to different software (with different libraries) later on. Chad Dombrova, the author of Pymel, and Ofer Koren, who contributed most of the Pymel UI library part (and is, I think, mainly into character rigging himself), might have good points to add to that. I can send you my and their email addresses privately if you want to contact them.

Olivier Renouard

The solution we were using is having a fixed middle ground (between animation controls and deformation rig): an "animation skeleton". Final deformation-only joints, skinning, and secondary motion can change depending on level of detail, shots, and needs (and thus animation can be baked onto it with more or less detail too, for render). IK/FK joints and rig/animation controls can change depending on what specific rig is used for the shot / job at hand (classic hand, finger crawling, whatever). But the middle "animation skeleton" was fixed for a specific character and production, and was the sort of "hub" where different departments could exchange (animation) data, mocap, keyframe, dynamics, output to render, knowing it was a shared asset. That's where the animation would get exported from / stored in libraries / reimported. Recovering it for additional animation meant having a way to drive "back" whatever specific animation rig / controls are used for the job at hand with that animation data, but since we mostly worked with mocap, our rigs had to include this "inverse rig" part (back from standard animation skel to rig controls) anyway to allow editing (be it as a "blend" or an "offset").

Olivier Renouard

It's about what we came up with to ensure interoperability between different software (Maya, MotionBuilder, Max) and different animation techniques (mocap, dynamic systems like Endorphin, keyframe, whatever).

The easiest common ground between all these systems is an animated hierarchy (skeleton). Here again we based ourselves on the H-Anim spec (about the only attempt at an open standard for animated bipeds I know of), extended to non-bipeds / non-humanoids as well. The rig is subject to change depending on software, studio, people, production (even sometimes on a shot basis). The deformation part also depends a lot on software (what tools are available) and usually evolves in time on productions (gets better, hopefully) or depending on the quality level required. We'd stick to a strict separation like:

Rig / Anim Controls <------ both ways -------> Anim Skel ----- one way ----> Deformation Rig

We never allowed anything to go direct from rig to deformation. Even if it means adding an extra 1:1 relation, the speed hit is negligible and it ensures nothing gets lost on exporting animation when rig or deformation techniques change.

Using that sort of standardized skeleton (and a chosen common set of parameters, ie facial) "in the middle" allows exchange of data (most animation methods or commercial animation libraries will deliver an animated skeleton as the source for the animation content) and re-usability (reuse motion libraries even if rig methods and the deformation rig have changed a lot since they were created). Separating this from the rig allows for rig changes, as long as there is a bi-directional setup between rig and anim skeleton (then again, people working with mocap are used to this constraint), and separating it from the deformation part allows the deformation techniques to evolve. It also means you only need what is necessary to describe the motion accurately in the animation skeleton (you don't need to care how many roll bones there are, or even whether you use a real cubitus-radius twisting system in the deformation rig; as long as you have a shoulder-elbow-wrist chain with all transformations, you can extract it).

The bi-directional setup means extra work, though it's pretty easy in most cases. Extracting a rotation plane and pole angle from a free-rotation ball joint (mocap often didn't bother much with elbow rotation limits when exporting) to "retro drive" an IK rig is something we had from the start. A good retro rig for the spine is a trickier example. But when you get that, it means animation from any source can be fed to controls animators are used to, and edited / replaced as needed.
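The pole-angle extraction mentioned above has a standard geometric core: project the elbow onto the shoulder-wrist line and push the pole target out along the perpendicular. A sketch with plain tuples as 3D vectors (the `scale` factor and function names are illustrative):

```python
def pole_vector(shoulder, elbow, wrist, scale=2.0):
    """Place an IK pole-vector target from three joint positions:
    project the elbow onto the shoulder->wrist line, then push the
    target out from that projection through the elbow."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    add = lambda a, b: tuple(x + y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    scl = lambda a, s: tuple(x * s for x in a)

    axis = sub(wrist, shoulder)                    # shoulder -> wrist line
    t = dot(sub(elbow, shoulder), axis) / dot(axis, axis)
    proj = add(shoulder, scl(axis, t))             # elbow's closest point on line
    return add(elbow, scl(sub(elbow, proj), scale))

# Elbow bent back along -z: the pole target lands further back on -z.
target = pole_vector((0, 0, 0), (1, 0, -1), (2, 0, 0))
```

Driving the IK control to this target (plus the wrist transform onto the IK grip) is what lets raw skeleton animation "retro drive" the rig.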

It has the added advantage of allowing the use of a simple, pretty much universal hierarchy model for the animation skeleton. You can go wild with Maya's own stuff in the rig or deformation part (if you use Maya), like rotate axis, joint orient, and other often non-translatable stuff. All you need for the animation hierarchy are three transformation matrices:

  1. the matrix of the local space you prefer to compute transformations in (a zero position matrix)

  2. the current transformation matrix of course

  3. a physiological "rest pose" matrix.
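With those matrices, "where is the joint relative to its preferred space" is just a matrix product: local = zero-pose-inverse times current. A minimal pure-Python sketch with 4x4 row-major matrices (translation-only here to keep the example short; the helper names are illustrative):

```python
def mat_mul(a, b):
    """4x4 row-major matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(m):
    """Invert a rotation+translation matrix: transpose the rotation
    block, negate and rotate the translation."""
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][j] * m[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0, 0, 0, 1]]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Motion expressed relative to the zero/rest pose: rest^-1 * current.
rest = translation(0, 10, 0)       # matrix 1 or 3: the reference pose
current = translation(1, 12, 0)    # matrix 2: where the joint is now
local = mat_mul(rigid_inverse(rest), current)
```

The same relative form is what deformers and muscle systems that "work relatively to a rest/bind pose" consume.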

This could be discussed further, as people on this thread have probably thought this out and may have a better definition of what the basic requirements are. The reasons behind the need for this extra info were:

  • Define the local space that will work best for calculation (interpolations) and representation of rotations. Many animation tools (constraints, IK systems, etc) are not totally seamless; it's important to be able to position singularities (like, but not limited to, Euler locks) where they are least problematic.

  • Many deformation tools don't work absolutely but relatively to a rest/bind pose. Also muscular systems are easier to compute if you can work relatively to a rest/0 pose corresponding to the most relaxed state.

  • Human (and other) bodies have physiological limits, if you have to decompose quat rotations into euler angles you want to prioritize the solutions that are in the range of these limits.

Then it's easier to convert that animated hierarchy into whatever funky transformation stack your specific package uses.

Quote (Drone): We had the "real" Maya hierarchy, which was handy to get controls (custom locators mostly) transforming in whatever local space you want them to, and inheriting what they must, then an "abstract" hierarchy (using message attribute connections) so that we could pickwalk through them in a meaningful way (it helps building "puppet" UIs for them too). Did I get the meaning of the question correctly?

Yes, that's what I thought; still, I see the need to rig this control skeleton for some things.

Quote:

Sounds handy and probably faster to parse than connections, as long as you keep ids up to date whenever you delete / add / reparent something (but problem is similar with custom connections) ?

Yes, this is awesome; my format idea is very similar to what you're describing: having base transforms and rest pose transforms, and the ability to transform it into anything you wish, with a hierarchy or not.

I was thinking today along the lines of ILM's approach to rigging with the Block Party stuff: the idea of building a rig that is controlled by plugs. Nulls that drive the animation skeleton get simply parented into the control skeleton's 'sockets', and have an id that they refer to, and probably a transform space relative to the control (socket) driving them.

So:

deformation skeleton > animation/rig Skeleton > nulls (plugs) <> controls (sockets)

This way you can swap in and out different skeleton/rigs, for the same controls. Different skeleton/rigs plugs can also have different relative space to the control sockets, that control them.
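The plug/socket swap above can be sketched with translation-only transforms: a plug keeps a fixed offset in its socket's space, so re-parenting the same plug to a different socket re-targets the whole skeleton. All class names are illustrative, and a real version would use full matrices rather than translation vectors.

```python
class Socket:
    """A control in the control skeleton that drives plugs parented
    under it."""
    def __init__(self, position):
        self.position = position  # world-space position (x, y, z)

class Plug:
    """A null driving an animation-skeleton joint, carrying a fixed
    offset expressed in its socket's space."""
    def __init__(self, plug_id, socket, offset):
        self.plug_id = plug_id    # the id the plug refers to
        self.socket = socket
        self.offset = offset

    def world_position(self):
        """Socket transform composed with the plug's relative offset."""
        return tuple(s + o for s, o in zip(self.socket.position, self.offset))

wrist_socket = Socket((5.0, 0.0, 0.0))
wrist_plug = Plug("L_wrist", wrist_socket, (0.0, 1.0, 0.0))
```

Swapping rigs is then just `wrist_plug.socket = other_socket`; the plug id and offset, not the control, define what gets driven.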


I was just thinking about what you describe. It's surely more logical and easier to work (blend, export, import) at the top of the chain (the inputs), so at the "cross" as you name it, rather than in the middle (at the animation skeleton).

However I'd see several problems :

  • The "cross" change quite a lot, depending on animation lead tastes. I don't think we could easily import control animation that was done back in 2004 on the 2009 version of the rig controls. You can setup "from control set 1 to drive control set 2" but then it becomes not much easier than designing a setup to "retro drive" controls from animation skel.

  • Some animation sources (software, methods) will have a hard time providing you with animation on the controls the animation lead has chosen, especially if they don't want to know anything about these (mocap, dynamics). Exporting animated skeletons is much more standard, be it from MotionBuilder, Massive, Endorphin...

  • Rigs are not perfect. Making animation go through the cross->skeleton route might lose some detail / add some artefacts that were not in it in the first place (like IK pops, for instance). Like with audio, when stacking filter after filter, some quality can be lost. I remember our mocap department was insistent that when no correction/offset was applied, the signal should remain "untouched". However, now that more advanced rigs can solve many of these problems (IK pops, again, can be solved), this is debatable.

The "standard" animation skeleton just has to be standard for your process / company (or even project if you only plan to exchange data between characters and department and not create or reuse animation in a library). So though we mainly had humanoids to do, for the non humanoids we'd define some skeleton that worked for everyone in the house: the deformation skeleton used by the skin / deformation team (and final render scenes) was usually more complex, the animation one just had what was needed to animate it and drive that final skeleton, so the animation team didn't have to cope with the whole thing (it also means having a common part in Hi and Lo LOD scenes, so you can swap assets easily without loosing animation).

Quote:

Originally Posted by eek

When you recovered animation onto the rig, is it keyed per frame, or are you actually physically rigging the controls onto the anim skeleton?

The latter. Without animation layers, that meant the skeleton was actually doubled; it might be possible to do something sleeker now. Note that it's iterable (it was our poor man's animation layers). When applying the rig to the animation skeleton, it is duplicated as a hidden IN and a visible OUT version to preserve existing animation. While the IN skel drives the controls (which can be used as offsets to add to the movement, or "blended out" to replace it) through that "inverse rig", the OUT skel is driven by the controls through the rig in the classic way. You can bake the result on the OUT skel (then the IN skel is deleted), delete the rig and reapply another one, then OUT is copied to IN preserving the last bake, etc. This was usually done for a specific body part (ie a hand requiring a specific setup for some part of the shot) rather than the whole. Blending or offsetting was done by part also (arm, leg, hand); when the controls for a specific part were at their origin / zero point and blending was 0, meaning you're fully preserving the pre-existing motion (no editing at all), that part would actually bypass the rig totally to avoid altering the signal. Also, only part of the rig could be mounted if not all of it was necessary (one arm, etc), and then only that part of the anim skel would be duplicated (as long as the rig is of the broken-hierarchy kind, it's not much of a problem).

You'd have:

Anim Skel (possibly with motion already) ---------> Deformation Skel

<< apply rig >>

(hidden) IN anim Skel (duplicated) -----> Controls -------> Anim Skel ---------> Deformation Skel
                 |                                              ^
                 |_____________________ bypass _________________|

<< bake >>

Anim Skel (with baked motion) ---------> Deformation Skel

Rinse and repeat...

Bit of a blunt approach; I'd be curious to try redesigning it using the new Maya animation layers...

mccollom73, have you looked at AdvancedSkeleton?

It's a pretty complete free auto-rig for maya.

Besides any cool features a rig can have, I always miss tools that help the animators, like:

- Window to select controllers like abxPicker;

- Mirror (parts or whole rig);

- Button to Key all controllers;

- Button to enable/disable Smooth;

- Button to set Linear or Stepped Key in Preferences and Selected Keys;

- Button to make playblast;

- Save/Load Poses;

- Organization in scene (Names, Groups, Layers...);

Some Autorig Examples

Rigging Templates (Xeash)

Jan Berger talks about MCS