Mold3D Studio’s real-time technology experience
Quintero formed Mold3D back in 2016, after he became interested in real-time technology and realized how it could transform the world of content creation. At the time, he was collaborating with Epic, working with Unreal Engine on projects such as Paragon and Robo Recall.
“Real-time technology struck a chord with me, and I thought that's where I should focus my energy,” he says. “I felt like it was the future, because I was able to visualize 3D art in real time, instead of the days and weeks that traditional rendering required.”
With some Unreal Engine experience now under his belt, Quintero joined Fox VFX Lab, where he was asked to head up its new VAD (virtual art department) and build a team. In those early days of virtual production, Quintero was using Unreal Engine to create pitches for films and to visualize environments for directors, enabling them to do virtual scouting and to set up shots, color, and lighting that were then fed to the visual effects vendor that would finish the film.
After his time at Fox VFX Lab, Quintero and his team were asked to be a part of the VAD for The Mandalorian. “It was the foundation of us starting up a studio solely devoted to the art of being a real-time studio,” he says. “I was trying to build for what I saw coming—the future of visual effects. We could all feel that this was happening.”
Shortly thereafter, Mold3D Studio was invited back to join the VAD for The Mandalorian Season 2. Around this time, the studio was also approached by Epic to work on the Unreal Engine 5 reveal demo. The team put the experience gained on projects like The Mandalorian to good use when tasked with creating extremely complex, high-resolution 3D models to show off Nanite, UE5’s virtualized micropolygon geometry system.
“It was exciting to get a taste of what’s coming in UE5,” says Quintero. “We’re currently using UE4 until UE5 is production-ready. There have been some great advances in the latest UE 4.27 release—especially in the realm of virtual production—but features like Nanite and Lumen are really going to change the game.”
Virtual production techniques help Mold3D Studio create Slay in a pandemic
After the UE5 demo wrapped, Quintero began talking to Epic about Slay. The proposal was to create a finished piece of final-pixel animated content in Unreal Engine. With the company now starting to get a name for environment art, the team was excited to demonstrate its expertise in story development and character design as well. With the exception of Windwalker Echo, the Slay assets, including her adversary, were all designed and created by Mold3D.
Just as Slay was getting greenlit, the pandemic hit. Quintero and his team set up a remote working environment that would enable them to work on real-time rendered animated content, as well as the other projects they had on their books.
“We quickly created a way to enable our company to work remotely on an animated short by making our pipeline virtual,” says Quintero.
Interestingly, it was virtual production techniques that ended up making this all possible. With the mocap happening in Las Vegas, Quintero’s team in Burbank directed the actors via Zoom, while viewing the results on the characters in real time in Unreal Engine, making it easy to ensure they had the takes they wanted.
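To give a rough sense of how a remote review loop like this can work, here is a minimal Python sketch of a receiver for streamed capture frames. The packet layout (frame ID, capture timestamp, joint transforms) and the port number are invented purely for illustration; this is not Unreal Engine’s actual Live Link protocol.

```python
# Hypothetical sketch of a remote mocap review receiver.
# The wire format below is invented for illustration; it is NOT
# Unreal Engine's Live Link protocol.
import socket
import struct
import time

NUM_JOINTS = 52        # assumed skeleton size
FLOATS_PER_JOINT = 7   # position (x, y, z) + rotation quaternion (x, y, z, w)
HEADER = struct.Struct("<Id")  # frame ID (uint32) + capture timestamp (double)
BODY = struct.Struct(f"<{NUM_JOINTS * FLOATS_PER_JOINT}f")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 54321))  # port chosen arbitrarily for this example

while True:
    packet, _addr = sock.recvfrom(HEADER.size + BODY.size)
    frame_id, captured_at = HEADER.unpack_from(packet)
    values = BODY.unpack_from(packet, HEADER.size)
    # Group the flat float list into per-joint 7-tuples.
    joints = [values[i:i + FLOATS_PER_JOINT]
              for i in range(0, len(values), FLOATS_PER_JOINT)]
    latency_ms = (time.time() - captured_at) * 1000.0
    print(f"frame {frame_id}: {len(joints)} joints, ~{latency_ms:.0f} ms old")
```

In production, transforms like these would drive a character rig inside the engine, so a team reviewing remotely sees the performance on the character the moment it is captured.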
“Although we probably would have done a lot of things the same way if there had been no pandemic, we were thankfully able to rely on the virtual production aspect of the filmmaking to save the day,” says Quintero.
After the main motion was captured, the team did a second session with the actor just for facial capture. For this, they used the Live Link Face iOS app.
“We were able to look at her takes with the recording that came out of the iPhone and also, on the day, we could see the camera looking at her,” says Quintero.
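As a rough illustration of what that facial data looks like, the sketch below applies ARKit-style blendshape weights, the kind of per-frame 0–1 curves Live Link Face streams, to a character’s morph targets. Only the ARKit curve names such as jawOpen and eyeBlinkLeft are real; the Character class, the apply_face_frame helper, and the curve-to-morph mapping are hypothetical stand-ins for a real rig.

```python
# Hypothetical sketch: driving morph targets from ARKit-style
# blendshape weights, the kind of data the Live Link Face app streams.
from dataclasses import dataclass, field

# Assumed mapping from a few of ARKit's 52 facial curves to this
# (invented) rig's morph target names.
CURVE_TO_MORPH = {
    "eyeBlinkLeft": "blink_L",
    "eyeBlinkRight": "blink_R",
    "jawOpen": "jaw_open",
    "mouthSmileLeft": "smile_L",
    "mouthSmileRight": "smile_R",
}

@dataclass
class Character:
    """Stand-in for a rigged character; stores morph target weights."""
    morphs: dict = field(default_factory=dict)

    def set_morph_target(self, name: str, weight: float) -> None:
        # Clamp to the valid 0..1 range before storing.
        self.morphs[name] = max(0.0, min(1.0, weight))

def apply_face_frame(character: Character, frame: dict) -> None:
    """Apply one frame of streamed curve-name -> weight data to the rig."""
    for curve, weight in frame.items():
        morph = CURVE_TO_MORPH.get(curve)
        if morph is not None:
            character.set_morph_target(morph, weight)

# Example: one captured frame of facial data.
echo = Character()
apply_face_frame(echo, {"jawOpen": 0.4, "eyeBlinkLeft": 1.0})
print(echo.morphs)  # {'jaw_open': 0.4, 'blink_L': 1.0}
```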