This week, I made an interactive project that uses ml5.js FaceMesh and p5.js to create an expressive real-time particle portrait. The system reacts to facial gestures such as smiling, blinking, opening the mouth, and tilting the head. It transforms a pixel-based visual field and triggers distinct visual effects, turning simple face motion into dynamic, poetic motion graphics.
Smile → Color Transition:
The particles shift between cold and warm tones based on the smile ratio.
Smiling more brings out warm colors; smiling less returns to cooler tones.
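A minimal sketch of how this mapping could work: normalize mouth width by face width so the ratio doesn't change with distance from the camera, then blend between a cool and a warm color. The landmark choice, rest/max ratios, and colors here are my illustrative assumptions, not the exact values from the project.

```javascript
// Linear blend between two RGB colors (like p5's lerpColor).
function lerpColor(cold, warm, t) {
  t = Math.min(1, Math.max(0, t));
  return cold.map((c, i) => Math.round(c + (warm[i] - c) * t));
}

// Smile amount in [0, 1]: mouth width normalized by face width,
// remapped between an assumed resting ratio and a wide-smile ratio.
function smileAmount(mouthWidth, faceWidth, restRatio = 0.35, maxRatio = 0.55) {
  const ratio = mouthWidth / faceWidth;
  return Math.min(1, Math.max(0, (ratio - restRatio) / (maxRatio - restRatio)));
}

const cold = [60, 120, 200];  // cool blue
const warm = [255, 140, 60];  // warm orange
const t = smileAmount(48, 100); // e.g. mouth 48 px wide on a 100 px face
const particleColor = lerpColor(cold, warm, t);
```

In the actual sketch, `mouthWidth` and `faceWidth` would come from distances between FaceMesh keypoints, and `particleColor` would be assigned per particle each frame.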
Open Mouth → Particle Blow:
When the mouth opens, particles near the mouth are “blown away” and then gently drift back, as if responding to a breath of air.
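One way to sketch this physics: give each particle a one-off outward impulse from the mouth point, then a spring force pulling it back toward its home pixel, with damping so the motion settles. The constants (strength, radius, spring, damping) are illustrative guesses, not the tuned values from the project.

```javascript
class Particle {
  constructor(homeX, homeY) {
    this.home = { x: homeX, y: homeY };
    this.pos = { x: homeX, y: homeY };
    this.vel = { x: 0, y: 0 };
  }
  // Push the particle away from the mouth point (mx, my);
  // closer particles get a stronger kick, falling off with distance.
  blow(mx, my, strength = 8, radius = 80) {
    const dx = this.pos.x - mx, dy = this.pos.y - my;
    const d = Math.hypot(dx, dy);
    if (d > 0 && d < radius) {
      const f = strength * (1 - d / radius);
      this.vel.x += (dx / d) * f;
      this.vel.y += (dy / d) * f;
    }
  }
  // Each frame: spring back toward home, damp the velocity, move.
  update(spring = 0.05, damping = 0.9) {
    this.vel.x += (this.home.x - this.pos.x) * spring;
    this.vel.y += (this.home.y - this.pos.y) * spring;
    this.vel.x *= damping;
    this.vel.y *= damping;
    this.pos.x += this.vel.x;
    this.pos.y += this.vel.y;
  }
}
```

Calling `blow()` only on frames where the mouth-open ratio crosses a threshold gives the "breath of air" feel: one burst outward, then a soft drift home.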
Blink → Particle Size Change:
The distance between eye landmarks controls the overall particle size.
When blinking, the particles shrink slightly, creating a breathing rhythm.
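This can be sketched as an eye-openness ratio (eyelid gap normalized by eye width) driving a size blend. The thresholds and size range below are assumptions; in the real sketch the two distances would come from FaceMesh eyelid and eye-corner keypoints.

```javascript
// Openness in [0, 1]: vertical lid gap over horizontal eye width,
// remapped between assumed closed/open ratios.
function eyeOpenness(lidGap, eyeWidth, closed = 0.05, open = 0.3) {
  const ratio = lidGap / eyeWidth;
  return Math.min(1, Math.max(0, (ratio - closed) / (open - closed)));
}

// Blend between a minimum and maximum particle radius.
function particleSize(openness, minSize = 1.5, maxSize = 4) {
  return minSize + (maxSize - minSize) * openness;
}
```

Because the ratio is normalized by eye width, the size response stays consistent whether the face is near or far from the camera, which is what makes the shrink read as a steady "breathing" rhythm.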
Head Tilt → Canvas Rotation Effect:
By detecting the relative position of both ears, the canvas appears to tilt dynamically, responding to head movement.
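A minimal sketch of the tilt estimate: take the angle of the line through the two ear landmarks with `Math.atan2`, and smooth it between frames so the rotation doesn't jitter. In p5 the smoothed angle would feed `rotate()` after translating to the canvas center; the smoothing factor here is an assumption.

```javascript
// Head tilt as the angle of the ear-to-ear line, in radians.
function tiltAngle(leftEar, rightEar) {
  return Math.atan2(rightEar.y - leftEar.y, rightEar.x - leftEar.x);
}

// Exponential smoothing: ease the displayed angle toward the new reading.
function smooth(prev, next, factor = 0.1) {
  return prev + (next - prev) * factor;
}
```

Smoothing matters more here than for the other gestures, because rotating the whole canvas amplifies any landmark noise at the edges of the frame.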
I started from simple sample code that used the ml5 face model and mapped video pixels into small moving particles. From there, I gradually integrated FaceMesh landmark tracking, experimenting with different facial features as data inputs, and tuned the balance between responsiveness and stability so the movements didn't look chaotic. For the "blow" effect, I tested several force and decay models until I achieved a natural motion where particles disperse and then softly return. I also refined the tilting and blinking behaviors.
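The pixel-to-particle mapping I started from can be sketched like this: sample the video's pixel array on a coarse grid and spawn one particle per bright-enough sample. The grid step and brightness threshold are illustrative assumptions, and in p5 the `pixels` array would come from `video.loadPixels()`.

```javascript
// Sample an RGBA pixel buffer on a grid; return particle seed points.
function samplePixels(pixels, width, height, step = 8, threshold = 40) {
  const particles = [];
  for (let y = 0; y < height; y += step) {
    for (let x = 0; x < width; x += step) {
      const i = (y * width + x) * 4; // 4 bytes per pixel: R, G, B, A
      const brightness = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
      if (brightness > threshold) particles.push({ x, y, brightness });
    }
  }
  return particles;
}
```

A larger `step` trades detail for performance, which was part of finding that balance between responsiveness and stability.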
Through this project, I learned how to:
Use ml5 FaceMesh for real-time landmark tracking and extract meaningful facial ratios.
Combine physics-inspired motion (velocity, damping, return force) with creative visuals.
Build a multi-parameter reactive system where each gesture controls a distinct visual layer.
Next Steps / Future Ideas
Add sound reactivity to connect breathing or smiling with ambient audio.
Introduce depth-based layering to make particles move in 3D space.
Experiment with emotion recognition and generate abstract visuals that reflect mood changes over time.