The Great Gatsby is not a film filled with explosions and wild visual effects. But, that does not mean that previs didn’t play a vital role in bringing Baz Luhrmann’s flashy vision to theaters.
The Great Gatsby follows F. Scott Fitzgerald's story about Nick Carraway, who arrives in New York and encounters the mysterious Jay Gatsby, whose illusory wealth and extravagant parties are but a sad smoke screen for luring his lost love Daisy back from her home across the bay, where she is married to Nick's friend, Tom. Previs Specialist Cameron Sonerson was tasked with helping Luhrmann plan a number of sequences for the period film, including one set along the New York High Line, in which Gatsby and Nick drive from Gatsby's mansion to a speakeasy downtown.
Sonerson and his team used aerial maps of New York to plan out the locations, some of which are fictitious. Using set models delivered from the art department, he built the stretch of road covering the Castle, the Valley, and the Queensboro Bridge, while modeling generic High Line sections, which he rigged to a curve and repeated as necessary. Then he modeled the shoot locations in Sydney and placed them seamlessly into his virtual New York set. At this point, Luhrmann could see the New York skyline in the background when he framed in and around the Valley of Ashes set constructed in Sydney.
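The "rig to a curve and repeat" step is easy to picture as a small scene-building script. The sketch below is a hypothetical illustration of the idea using Maya's Python commands, not Sonerson's actual tool; the node names (highline_section_geo, highline_path_crv) and the copy count are assumptions.

    # Hypothetical sketch: distribute copies of a generic High Line section
    # along a guide curve, roughly the "rig to a curve and repeat" idea above.
    import math
    import maya.cmds as cmds

    def repeat_along_curve(section='highline_section_geo',
                           curve='highline_path_crv', copies=20):
        """Duplicate one modeled section and lay the copies along a NURBS curve."""
        for i in range(copies):
            u = i / float(max(copies - 1, 1))            # 0..1 along the curve
            pos = cmds.pointOnCurve(curve, pr=u, top=True, p=True)
            tan = cmds.pointOnCurve(curve, pr=u, top=True, nt=True)
            dup = cmds.duplicate(section, name='highline_section_%03d' % i)[0]
            cmds.xform(dup, ws=True, t=pos)
            # point each copy down the curve's tangent (Y-up scene, section
            # assumed to be modeled facing +Z)
            yaw = math.degrees(math.atan2(tan[0], tan[2]))
            cmds.xform(dup, ws=True, ro=(0, yaw, 0))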
Using Autodesk Maya, Sonerson built and rigged low-polygon models of the actors, along with the yellow Duesy and the other vehicles it slaloms and swerves around on its way down the road. "We hand-keyed some generic car swerves and used driving simulators in Maya, and then applied them to the car rig," Sonerson explains. "Then we placed parked and moving traffic timed for the Duesy to swerve in and out of. We used expression-driven animation on the characters to provide automated lean and tilt whenever the car swerves."
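Expression-driven secondary motion of this kind can be wired up with a short expression per character. The following is a minimal, hypothetical sketch of the technique, assuming control names like duesy_car_ctrl and nick_root_ctrl; the lean axis, multiplier, and clamp range are placeholders, not production values.

    # Hypothetical sketch: lean a passenger opposite the car's per-frame change
    # in heading, so he tips into every swerve automatically.
    import maya.cmds as cmds

    lean_expr = (
        'float $prevYaw = `getAttr -time (frame - 1) duesy_car_ctrl.rotateY`;\n'
        'float $turn = duesy_car_ctrl.rotateY - $prevYaw;\n'
        # clamp the roll so a hard swerve never folds the character over
        'nick_root_ctrl.rotateZ = clamp(-15, 15, $turn * -4);'
    )
    cmds.expression(name='passenger_lean_expr', string=lean_expr)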
A table read of the script was then cut into the edit. "We put a bunch of coverage cameras on the driving animation to frame the characters when we needed a close-up during the dialog," Sonerson explains.
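Coverage cameras like these amount to cameras that ride with the car and stay aimed at a character. A hedged sketch of that setup in Maya Python follows; the locator and control names are hypothetical.

    # Hypothetical sketch: a coverage camera parented to the moving car and
    # aim-constrained to a character locator for dialog close-ups.
    import maya.cmds as cmds

    def add_coverage_camera(target='nick_head_loc', parent='duesy_car_ctrl',
                            name='coverage_cam_nick', focal=50):
        cam, cam_shape = cmds.camera(focalLength=focal, name=name)
        cam = cmds.parent(cam, parent)[0]        # ride along with the car
        # cameras look down -Z, so aim that axis at the character
        cmds.aimConstraint(target, cam, aimVector=(0, 0, -1), upVector=(0, 1, 0),
                           worldUpType='scene', maintainOffset=False)
        return cam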
Once the group had a basic edit of the story, they dressed the scene with buildings and added atmospheric effects: sparks when the car bottoms out, taillights breaking, exhaust fumes, and so forth. "We put trains on the High Line, added police officers and people jumping out of the way. We had exterior insert shots of the Duesy, showing close-ups of the tire and grill, as well as static camera drive-bys," adds Sonerson.
Like his fellow previs artists, Sonerson stresses the need to keep things simple: "When your Maya scene is too heavy to tumble, you have problems. I'll use texture cards and other tricks to keep a scene optimized and geometrically light, while still maintaining the look the director wants. Reflections are always difficult to achieve, but if required, I'll use HDRI shaders or inverse camera setups to get a reflected look in a mirror, for example."
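A texture card in this sense is just a single plane carrying a photograph or rendered element in place of heavy geometry. As a rough, hypothetical illustration of the idea (the names and defaults are made up, and this is not Sonerson's pipeline code):

    # Hypothetical sketch: a one-polygon "texture card" standing in for heavy
    # background geometry, keeping the viewport light.
    import maya.cmds as cmds

    def make_texture_card(image_path, name='skyline_card', width=100, height=40):
        card = cmds.polyPlane(name=name, width=width, height=height,
                              subdivisionsX=1, subdivisionsY=1, axis=(0, 0, 1))[0]
        shader = cmds.shadingNode('lambert', asShader=True, name=name + '_mtl')
        sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                       name=name + 'SG')
        cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader', force=True)
        file_node = cmds.shadingNode('file', asTexture=True, name=name + '_file')
        cmds.setAttr(file_node + '.fileTextureName', image_path, type='string')
        cmds.connectAttr(file_node + '.outColor', shader + '.color', force=True)
        cmds.sets(card, edit=True, forceElement=sg)
        return card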
Instead of using particles for effects, which often require taxing dynamic simulations, Sonerson relies on element libraries - for sparks, fire, lightning, and so forth - placing them on cards to achieve a photoreal look without bogging down the viewport. While lighting isn't often necessary in previs, Sonerson says, "for Gatsby, we added lighting and effects to suggest the mood of the scene."
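Playing a pre-rendered element on a card usually comes down to pointing the card's file texture at an image sequence. The snippet below is a hedged sketch of that, assuming a card and file node built along the lines of the previous example; the node name, file path, and start frame are hypothetical.

    # Hypothetical sketch: drive a card's file texture with a numbered image
    # sequence (e.g., a spark element) instead of simulating particles.
    import maya.cmds as cmds

    def play_element_on_card(file_node='sparks_card_file',
                             first_frame='elements/sparks/spark_burst.0001.png',
                             start=96):
        cmds.setAttr(file_node + '.fileTextureName', first_frame, type='string')
        cmds.setAttr(file_node + '.useFrameExtension', 1)
        # step the sequence with the timeline, offset so the burst begins
        # around the moment the car bottoms out (start frame is a placeholder)
        cmds.expression(name=file_node + '_frameExpr',
                        string=file_node + '.frameExtension = max(1, frame - '
                               + str(start) + ');')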
When the entire sequence was sketched out, Sonerson presented it to Luhrmann, who, studying it closely, began blocking out his shots. "We needed to break down each shot for the live-action shoot," says Sonerson. "There were two main setups for the shoot: the static car set on bluescreen for the actors and the stunt-driving location set for the exterior shots." After modeling the bluescreen set, Sonerson inverted the camera animation onto the static car set so it played out on the bluescreen stage. Then, for the location shoot, he did layouts according to Luhrmann's shot designs for the stunt team and crew to follow.
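That inverse step boils down to re-expressing the previs camera's world motion in the car's local space, so the camera carries all of the movement while the car sits still on the stage. A minimal, hypothetical sketch of such a baking pass follows; the camera and control names and the frame range are assumptions, not the production setup.

    # Hypothetical sketch: bake the previs camera relative to the moving car,
    # so the same framing plays back around a static car on the bluescreen set.
    import maya.cmds as cmds
    import maya.api.OpenMaya as om

    def bake_camera_relative_to_car(src_cam='previs_cam', car='duesy_car_ctrl',
                                    dst_cam='stage_cam', start=1001, end=1240):
        for frame in range(start, end + 1):
            cmds.currentTime(frame, edit=True)
            cam_m = om.MMatrix(cmds.xform(src_cam, q=True, ws=True, matrix=True))
            car_m = om.MMatrix(cmds.xform(car, q=True, ws=True, matrix=True))
            rel_m = cam_m * car_m.inverse()      # camera in the car's frame
            flat = [rel_m.getElement(r, c) for r in range(4) for c in range(4)]
            cmds.xform(dst_cam, ws=True, matrix=flat)
            cmds.setKeyframe(dst_cam, attribute=['translate', 'rotate'])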
To head off any miscommunication or uncertainty on set, Sonerson and his team set up an office next to the soundstages. Real-time previs capabilities were essential to Luhrmann, much as they were to del Toro. Indeed, both directors used a similar iPad augmented-reality system to composite live-action and previs in real time. Using compass-based iPad software and proprietary Maya tools, Sonerson incorporated virtual views from his previs locations into the frame, helping the director, the DP, and the crew see what they were shooting. "This would allow them to have a real-time view of what would be beyond the bluescreen, allowing them to frame accordingly. If they were shooting on location, we could update the iPad remotely to add more environments and viewpoints in a fast turnaround," says Sonerson.