With nearly 300 VFX shots in production, Framestore already had a lot on its plate. Besides the Trash Mesa environments and crowds, Framestore’s Montreal facility was responsible for creating a deserted Las Vegas – designed with Syd Mead – and a glitchy, holographic shell for a computerized assistant named Joi (Ana de Armas). The idea for motion capture emerged later, almost on a whim.
“Montreal was considering options for the Trash Mesa sequence. The environments and character models were done, but the shots weren’t,” said Richard Graham, studio manager for Framestore’s Capture Lab. “They needed people to populate the wide and aerial shots they were working on. Our job was to provide realistic skeletal data and a variety of motions to the animation team so they could apply diverse motion across their digital crowd. The only hiccup was that we were in London.”
Graham and senior mocap TD Gerardo Corona Sorchini decided to use Vicon’s Shōgun, Epic’s Unreal Engine and a little proprietary tech to create a live link between Montreal and the Framestore Capture Lab in London. Now, anything captured by the Vicon cameras could be streamed in (almost) real-time to Framestore Montreal’s cinema room, where the supervision team was standing by, ready to offer notes or call out new moves.
The capture setup consisted of two mocap performers, 16 Vicon cameras and a 4 x 5-meter volume.
The data was captured in real-time using Vicon Shōgun, with the solved skeleton then streamed into Unreal Engine. The output from Unreal was then sent over a dedicated transcontinental connection on Framestore’s internal network so it could be viewed in Montreal. The new setup worked like a dream, allowing London to stream 1080p video at 60fps to Montreal with only 100 milliseconds of lag between continents, giving the supervision team a clear view of the fidelity of each motion.
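The article doesn’t describe the proprietary piece of that link, but the general pattern, a solved skeleton pushed over a socket frame by frame at the capture rate, can be sketched in a few lines. The Python below is purely illustrative: the joint list, get_solved_frame() and the endpoint address are invented stand-ins, not Framestore or Vicon code.

```python
# Hypothetical sketch of a per-frame skeleton stream; all names are invented.
import json
import socket
import time

JOINTS = ["hips", "spine", "head", "l_hand", "r_hand"]  # toy skeleton

def get_solved_frame(frame_idx):
    """Stand-in for a solved frame: joint -> (tx, ty, tz, rx, ry, rz)."""
    return {joint: (0.0, 0.0, 0.0, 0.0, 0.0, 0.0) for joint in JOINTS}

def stream_skeleton(host="127.0.0.1", port=9000, fps=60, frames=600):
    """Send `frames` solved frames as UDP/JSON packets at roughly `fps`."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / fps
    for frame_idx in range(frames):
        start = time.monotonic()
        packet = {"frame": frame_idx, "joints": get_solved_frame(frame_idx)}
        sock.sendto(json.dumps(packet).encode("utf-8"), (host, port))
        # Sleep off the remainder of the frame period to hold ~60 packets/s.
        time.sleep(max(0.0, period - (time.monotonic() - start)))

if __name__ == "__main__":
    stream_skeleton()
```

At 60fps the sender has roughly 16.7 milliseconds per frame, which suggests most of the 100 milliseconds of lag quoted above was transit time rather than per-frame processing.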
“Shōgun has been built from the ground up to run in real-time, supporting multiple actors and props across small or large systems,” said Tim Doubleday, entertainment product manager at Vicon. “This fits perfectly into the expanding world of virtual production, where the focus is on getting live data onto film-quality assets on set in real-time.”
In Montreal, the performances were applied to digital characters already placed in the Trash Mesa CG environment. A second feed showed live video of the performers themselves, who could be directed over a linked audio call. Using these two feeds, project leads could make strategic decisions about movement variations that would later be cut up and delivered to the animation team for selective tweaks. Once the shoot wrapped, Shōgun’s processing speed allowed the London lab to get finalized data back to Montreal in under 24 hours, ensuring the project could move forward and that Graham and Sorchini could head home at a reasonable hour.
“Vicon makes the best motion capture systems on the planet; it feels like all other companies are just playing catch-up,” added Sorchini. “With the cameras, you have a robust system that is precise and stable, while the software is presented in a really wise and clever way. Every year, it gets better and better.”
Shōgun enjoyed its inaugural run at Framestore on the Blade Runner 2049 project, replacing the studio’s long-running Blade system. Released last April, Shōgun is Vicon’s flagship entertainment software, built for the needs of high-end productions. On Blade Runner 2049, it allowed Framestore to batch process all of their data, removing the need to convert files and saving the company hours of unnecessary labor.
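The article doesn’t spell out what that batch pass looked like, so the sketch below is hypothetical: a single walk over a directory of takes that writes solved output directly, with no separate conversion step. The paths, the .take extension and solve_take() are all invented for illustration.

```python
# Hypothetical batch pass over a folder of captured takes; not Vicon's API.
from pathlib import Path

def solve_take(take_path: Path, out_dir: Path) -> Path:
    """Stand-in for a per-take solve; writes a processed file and returns it."""
    out_path = out_dir / take_path.with_suffix(".fbx").name
    out_path.write_bytes(take_path.read_bytes())  # placeholder for real solving
    return out_path

def batch_solve(capture_dir="takes", out_dir="solved"):
    """Solve every take in one pass, skipping any intermediate conversion."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for take in sorted(Path(capture_dir).glob("*.take")):
        print("solved:", solve_take(take, out))

if __name__ == "__main__":
    batch_solve()
```

The point of the pattern is simply that each take goes from capture to deliverable in one automated step, which is where the saved hours come from.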
“It’s no secret that virtual production is becoming more important to big visual effects houses like ours,” added Graham. “When you can create freely with your co-workers, even when you are across the ocean, that’s really powerful. And when you can stream film-quality mocap from that far … the sky’s the limit.”