Summary:
How I capture body and face performance alone using markerless mocap and UE monocam facial capture, then translate that performance directly into emotionally grounded layout.
Why it matters:
This pipeline collapses previs → layout → performance into a single expressive loop, enabling immediate iteration, emotional clarity, and predictable directing rhythms.
Scripted performance: markerless body and facial capture imported into UE.
Blocking in a game level with a cross-camera solution. (No lighting; raw data.)
This case study explores how markerless mocap and UE monocam facial capture allow me to direct performance, build layout, and shape emotional beats entirely on my own — something that once required a warehouse and a team.
I began in the ILM era of mocap — giant warehouses, dozens of cameras, entire teams.
Today, I’m capturing body and face performance alone in my garage.
This democratization of mocap is what makes real‑time directing immediate, intimate, and emotionally alive.
Narrative Alignment Framework
Beat‑Polish Clarity Map
Remote‑First Documentation Ecosystem
Captured body performance using markerless mocap
Captured facial performance using UE monocam
Retargeted data directly into MetaHuman
Built layout from my own performance
Shaped emotional beats through camera, timing, and silhouette
Iterated in real time
Instant iteration
Emotionally grounded layout
Reduced revisions
A fully democratized directing pipeline
Emotion is the fastest way to align a scene — and the most scalable.
Summary:
A fast, expressive pipeline for translating 2D illustrations into stylized 3D characters ready for real‑time performance.
Why it matters:
This workflow lets me prototype characters quickly, test them in layout, and refine emotional silhouette before committing to full production.
This case study breaks down how I translate 2D illustrations into stylized 3D characters using AI‑assisted workflows, orthographic sheets, and real‑time integration — all feeding directly into my mocap‑driven layout pipeline.
Because I can now direct performance myself, I needed a fast way to prototype characters and test them in layout.
This pipeline lets me move from concept → sculpt → performance in a single creative loop.
Beat‑Polish Clarity Map
Remote‑First Documentation Ecosystem
Artist (me): B&W character illustration
AI‑assisted stylized 3D sculpt
Fortnite‑style render
Orthographic sheets (front/side/back)
T‑pose for rigging
Integration into MetaHuman + mocap pipeline
Rapid character prototyping
Consistent stylization
Immediate performance testing
A unified pipeline from concept → layout
AI accelerates exploration — but emotion still leads.
Summary:
The narrative alignment frameworks, beat‑polish clarity maps, and documentation ecosystems I built to help remote teams — and now myself — move with clarity and emotional intention.
Why it matters:
These systems reduce chaos, accelerate iteration, and create humane creative environments.
This case study highlights the narrative alignment frameworks, beat‑polish clarity maps, and documentation ecosystems I built to help remote teams — and now myself — move with clarity, emotional intention, and predictable iteration.
These systems were built for remote teams, but the same frameworks now align and empower my solo directing practice.
Narrative Alignment Framework
Beat‑Polish Clarity Map
Remote‑First Documentation Ecosystem
Real‑Time Animation Onboarding Curriculum
Production Visibility System
Designed emotional alignment flows
Built clarity maps for beats and polish
Created intuitive documentation structures
Developed onboarding grammar for real‑time animation
Implemented dependency‑aware planning
Faster iteration
Clearer communication
Higher morale
Stronger emotional alignment
Predictable delivery
Creative systems are emotional systems — and when they’re humane, teams thrive.
If you’re exploring real‑time storytelling, performance‑driven pipelines, or creative systems that empower artists, I’d love to connect.
ARCHIVE OF EARLIER WORK
If you’d like to explore my older work from ILM, EA, Unity, and my early AI previs experiments, you can find my previous portfolio here: