I broke into physical production on a brilliant (read: indie) feature called Milk and subsequently landed work in television, documentary and commercial production, yet my visualization exposure was generally limited to seeing some storyboards. In the theater we called visualization “rehearsal.”
It wouldn’t be until I landed my first big-budget CG feature that I would hear the word “previs” for the first time.
That picture was John Carter, then of Mars. I was the producer’s assistant, so I had the privilege of spending a lot of time with the creatives during development, especially with the previs team: a tidy group of four, led by their supervisor, Daniel Gregoire. I would take notes during the director and art department reviews, chase them for cuts, break it all down, the usual. I was immediately struck by how much such a small team of artists was contributing to the authorship of the film.
Our director, Andrew Stanton, was the ninth Pixar employee ever, so the animation studio — along with its culture, pace and resources — had always been his home. This being his first live-action film, he was quick to note the subtle (and not so subtle) differences between the animated life and the live-action world. I remember him describing live-action filmmakers as a band of gypsies — a collection of ridiculously talented people who show up, lay out their wares, put on a magic show, then pack up their tent and move on to the next city.
To me, the previs team was the ultimate personification of that ethos. It was impressive how many inventive ways they found to feed an unquenchable creative process, not to mention the pace at which they turned things around. The keys were simple: creativity and speed. It wasn’t until later that I realized that the core around which that successful process orbits is technology.
Cut to our next show, World War Z, and I have the privilege of working with Dan again, only this time he’s flying solo while we’re on various locations, based out of the UK. The rest of his team worked remotely from his studio in Santa Monica, CA.
I helped Dan set up their motion-capture volume on several occasions, wherever we could squirrel away some space: an abandoned (and probably haunted) back dwelling at Longcross Studios, a ballroom at the Westin in Malta, even the middle of the Fantasy Studios model shop, where we ran VCam sessions quarantined behind plastic. Wherever, whatever, the team was always effective at turning around a cut that fed everyone, keeping the machine in motion, winning days.
Because Dan was solo on location, he was always well equipped to make sure he got the coverage he needed, including wearing GoPros mounted atop a faceless biker’s helmet during location scouts. All the department heads joked about it all day long, but when the sun rose the next morning, you better believe Dan was the one everyone went to for answers.
In 2015, Dan reached out to me about an opportunity to produce in-game cinematics. I was excited to join the team, get back behind the curtain and see what the studio was up to. When I arrived, there was a VR project in production for the Smithsonian Institution. I put down my backpack and threw on the goggles to immerse myself in my first full 360-degree virtual environment. It opened my eyes to a completely new storytelling landscape, and I had been in the door for just five minutes.
It seemed light years from where we were in Malta — the difference, it turned out, was game engines.
Dan had started in games, so utilizing a game engine for visualization had been on his radar from the very beginning. He knew it was cost-prohibitive at the time, but if anyone had the understanding and the means, it was George Lucas.
“I pitched the idea back when we were collaborating on Star Wars: Episode II — Attack of the Clones,” Dan recalls. “He was receptive to the idea, but the technology was so crazy expensive that they couldn’t justify the development costs. Instead, we concentrated on pushing those early Lucas videomatics into what would become a new digital workflow.”
When Unreal Engine 4.0 was released to the masses in September of 2014, Dan pounced on the opportunity. Though the real driving force at the time was in the virtual reality space, by the time I arrived the next summer Unreal was already being used across multiple project types at the studio — video game cinematics and amusement park ride experiences — and was just entering the nascent stages of being utilized on our feature film projects.
We’ve been using it ever since. The incorporation of game engines into our visualization pipelines has allowed us to achieve greater visual fidelity, accuracy and speed than ever before. Filmmakers who are new to the virtual space can pick up an iPad and, with a few clicks of the controllers, capture a whole selection of shots that can be fed immediately to editorial.
Rendering through Unreal allows everyone to see everything in a more finals-like way earlier in the process. Working in real time allows previs to remain in editorial longer through post, up to and including screenings for studio executives and, ultimately, civilian test screenings, without Maya playblasts scattered amongst WIP VFX ripping the audience out of the story.
Virtual art departments are becoming more ubiquitous as well. On-set tools allow filmmakers to previsualize shots closer to final than ever before. You can swap assets, effects, characters and environments per request, on the day. LED screens, virtual cameras, photogrammetry, motion control, simulcam — directors have more control without being hindered by technical limitations, freeing them to focus on telling the story they want to tell.
Game engines continue to reveal new efficiencies outside the traditional VFX pipeline. As more studios embrace the technology, assets can be utilized concurrently across the transmedia spectrum: publicity and marketing, video game tie-ins, web content and every subsequent franchise spin-off.
Ultimately, the technological trends depend on the artists and filmmakers themselves. You can have all the amazing toys at your disposal, but if you don’t have the leadership and the right team to collaborate, create and execute, the toys won’t matter. In the end, it takes a human touch to make the magic trick happen.
Ask Chris Ferriter, our executive producer & CEO, who stresses that it’s not about following trends. “It’s all about giving filmmakers the tools to tell a better story,” he says. “If a piece of technology or a new technique doesn’t increase quality or help control cost, it probably won’t see widespread adoption.”
Richard Enriquez is a producer at Halon Entertainment in Santa Monica, CA.