Details
This reel showcases two distinct VFX disciplines. The first section demonstrates matchmove and environment reconstruction: we tracked live-action footage in SynthEyes, solved lens distortion and scene scale, then brought LiDAR scan data into Blender and aligned it with the camera solve, giving us a reliable foundation for CG integration. The second section is roto and paint work for an upcoming 2026 feature film, in which we replaced an actor's leg and extended his trousers with folded ends to accurately portray a paraplegic character.
The Challenge
The matchmove work required precise lens distortion calculation and accurate scene scaling to ensure the LiDAR data aligned seamlessly with the camera footage. The feature film roto demanded pixel-perfect tracking and clean paint work to convincingly replace a real limb on screen; any imperfection would immediately break the illusion.
Our Solution
For the tracking shot, we solved the camera and lens distortion in SynthEyes, then exported the scene to Blender, where we aligned LiDAR point-cloud data with the tracked camera to get a precise 3D representation of the real environment. For the feature film, we keyed the green sock worn by the actor, tracked the footage so our paint work stayed locked to his movement, then painted the replacement leg and extended trousers with folded ends frame by frame.
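Aligning a LiDAR point cloud with a camera solve comes down to finding the similarity transform (uniform scale, rotation, translation) that maps one set of corresponding points onto the other. In practice this is done interactively or with survey points inside SynthEyes or Blender, but the underlying math is the Umeyama least-squares alignment; a minimal sketch in Python (all data here is synthetic and illustrative):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares scale s, rotation R, translation t mapping
    src -> dst (Umeyama alignment). src, dst: (N, 3) paired points."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    s_c, d_c = src - mu_s, dst - mu_d
    cov = d_c.T @ s_c / len(src)            # cross-covariance
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / s_c.var(0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Example: recover a known transform from paired points
rng = np.random.default_rng(0)
lidar = rng.normal(size=(100, 3))           # stand-in for LiDAR survey points
true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(true_R) < 0:               # keep it a proper rotation
    true_R[:, 0] *= -1
solved = 2.5 * lidar @ true_R.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(lidar, solved)
aligned = s * lidar @ R.T + t
print(np.allclose(aligned, solved))  # True
```

With the scale, rotation, and translation recovered, the same transform can be applied to the whole scan so it sits correctly in the tracked scene.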
Behind the Scenes
- SynthEyes was used for camera tracking, lens distortion calculation, and scene scaling
- LiDAR scan data was imported and aligned with the camera solve in Blender
- Green sock keying provided a clean matte for the leg replacement
- The footage was tracked to ensure the painted replacement stayed locked to the actor's movement
- Trouser extension and folded ends were painted on top to complete the illusion
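The green-sock key in the steps above would typically be pulled with a compositing-package keyer, but the core idea, measuring how much green dominates the other channels, can be sketched in a few lines of Python (function names and thresholds here are illustrative):

```python
import numpy as np

def green_matte(rgb, lift=0.0, gain=2.0):
    """Simple green-dominance matte. rgb: (H, W, 3) floats in 0-1.
    Returns a 0-1 alpha where green-screen pixels approach 1."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    matte = g - np.maximum(r, b)            # keyer core: green dominance
    return np.clip((matte - lift) * gain, 0.0, 1.0)

# Synthetic frame: left half green "sock", right half skin tone
frame = np.zeros((4, 8, 3))
frame[:, :4] = [0.1, 0.9, 0.1]              # green sock region
frame[:, 4:] = [0.8, 0.6, 0.5]              # non-green region
alpha = green_matte(frame)
print(alpha[0, 0], alpha[0, 7])  # 1.0 0.0
```

The resulting matte isolates the sock so the tracked paint work can be composited only where the limb needs replacing; `lift` and `gain` play the role of the black/white-point controls found in production keyers.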
Let's work on something amazing together.
Bring your vision to life with cutting-edge visual effects.