Building Virtually Tracked Content
Supporting the opulent vision of Psyop's Director and Founding Partner Marco Spier and his colleagues, ERG began its assignment by optimizing the geometry and textures of Psyop-provided models for LED volume production. "In direct collaboration with Marco, we were tasked with doing all the scene building, lighting, and final touches," Glantz began.
"We were responsible for optimizing, and in some cases, creating, the content appearing in the volume, which is the space where the virtual-tracked content is presented for motion capture. In other words, the volume is comprised of the LED screens where the Unreal Engine technology renders all the complex visual effects shots in real-time during the live-action shoot."
Drilling further into cinema magic widely hailed as the future of production, ERG's maestros began with models created in Autodesk Maya, Maxon's Cinema 4D, SideFX's Houdini, and Adobe's Substance Designer. Those assets were then integrated into the Unreal Engine volume environment, where optimization largely focused on geometry.
According to Beery, "This project's high number of interacting animation and lighting elements required a tight polygon budget to render everything at an acceptable frame rate. Our workflow allowed us to optimize and re-topologize these models to work well within Unreal, and required us to model each component and environment accordingly to meet the industry standard for motion capture."
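In practice, enforcing a budget like this often comes down to auditing assets directly in the editor. The sketch below uses Unreal's editor Python API to flag over-budget static meshes; the /Game/Vegas content path and the 50,000-vertex threshold are illustrative assumptions, not figures from this production.

```python
# Minimal sketch: flag static meshes whose LOD 0 vertex count exceeds a
# budget so they can be queued for retopology. Assumes the Unreal Editor
# Python plugin is enabled; the content path and budget are hypothetical.
import unreal

CONTENT_PATH = "/Game/Vegas"  # assumed project folder
VERTEX_BUDGET = 50_000        # assumed per-asset budget

for path in unreal.EditorAssetLibrary.list_assets(CONTENT_PATH, recursive=True):
    asset = unreal.EditorAssetLibrary.load_asset(path)
    if not isinstance(asset, unreal.StaticMesh):
        continue  # skip materials, textures, Blueprints, etc.
    verts = unreal.EditorStaticMeshLibrary.get_number_verts(asset, 0)  # LOD 0
    if verts > VERTEX_BUDGET:
        unreal.log_warning(f"{path}: {verts} verts over budget, retopo candidate")
```

A report like this hands artists a concrete retopology queue up front, rather than leaving the team to chase frame-rate drops on the volume during the shoot.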
While this artistry created the vibrant worlds in which the campaign's talent seamlessly appears in the finished brand film, there is more to the process of making the content interactive to serve the filmmakers' needs during production.
Hybrid UE Workflow for Real-Time Visual Effects
Following Psyop's aim to leverage the talent and technology behind the award-winning Disney+ series "The Mandalorian," the campaign's physical production took place at LA's Nant Studios and involved acclaimed director of photography Matthew Jensen, ASC.
As part of its pre-production services, and understanding how strongly prepared lighting can impact in-camera virtual production, ERG created and programmed several "baked" lighting scenarios for instant activation on set. These covered factors such as different sun positions and effects like "lazy God rays" with appropriate shadows.
"This approach of preparing baked lighting workflows saves a huge amount of time on set," Turino Grosso explained. "In settings like this, it allows us to give the director and DP content that is consistent across all playthroughs, which can be invaluable for blocking and rehearsals."
Anticipating on-set demands for dynamic interactive lighting, Turino Grosso programmed full cue-to-cue and in-engine control via DMX. This allowed him to run both UE environment lighting and physical lighting on the set from a grandMA3 lighting console. The results include on-the-fly color adjustments and variable speed for the pinball's roll, among many other effects.
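For readers curious how console data reaches engine parameters, DMX is commonly carried over the network as Art-Net, which the engine side can listen for and map onto virtual controls. The following sketch receives Art-Net frames and maps a few channels to illustrative parameters; the channel patch (1-3 for environment tint, 4 for the pinball's roll speed) is a hypothetical example, while the actual production drove Unreal's DMX support from the grandMA3.

```python
# Minimal sketch: receive DMX over Art-Net (UDP port 6454) and map channels
# to engine-side parameters. The channel layout is a hypothetical patch,
# not ERG's actual configuration.
import socket
import struct

ARTNET_PORT = 6454
OP_DMX = 0x5000  # ArtDmx opcode

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", ARTNET_PORT))

while True:
    packet, _ = sock.recvfrom(1024)
    if not packet.startswith(b"Art-Net\x00"):
        continue  # not an Art-Net packet
    (opcode,) = struct.unpack("<H", packet[8:10])   # opcode is little-endian
    if opcode != OP_DMX:
        continue  # ignore polls and other non-DMX traffic
    (length,) = struct.unpack(">H", packet[16:18])  # data length is big-endian
    dmx = packet[18:18 + length]
    if len(dmx) < 4:
        continue
    # Hypothetical patch: channels 1-3 tint the virtual environment,
    # channel 4 scales the pinball's roll speed from 0x to 2x.
    r, g, b = dmx[0] / 255, dmx[1] / 255, dmx[2] / 255
    roll_speed = dmx[3] / 255 * 2.0
    print(f"env tint=({r:.2f},{g:.2f},{b:.2f}) pinball roll x{roll_speed:.2f}")
```

Mapping console faders to engine parameters this way is what lets a single operator adjust virtual and physical lighting together, in sync, from one desk.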
For ERG, Turino Grosso, Beery, and Granieri actively participated in the production, providing quality control during rehearsals and the shoot in concert with executives from Nant Studios and Epic Games.
"Along with Virtual Production Supervisor Lawrence Jones, Marco gave us an amazing set of references and concept art that helped us understand the goal of telling a dynamic story blending classic Las Vegas with the newest technology," Kingdon added. "In this case, the use of bold physical set pieces to bend the line between photoreal and surreal elements – what we call photo-surrealism – worked like a charm. We were honored to be part of this epic, groundbreaking campaign, alongside the world's premier talents."
Images courtesy Extended Reality Group.