Steven Spielberg could have invaded the planet with a spectacular battalion of transparent, high-tech spaceships. He didn’t. Rather, for his 21st-century cinematic version of H.G. Wells’ 1898 novel The War of the Worlds, he focused on one man’s desperate attempt to protect his family from a terrifying attack by mysterious, relentless, and insuperable forces that arise from within the bowels of the Earth.
The Paramount Pictures/DreamWorks film stars Tom Cruise as a New Jersey dockworker who’s a deadbeat dad with two children. “It’s told from Tom’s point of view,” says Industrial Light & Magic’s Dennis Muren, co-visual effects supervisor along with Pablo Helman. “He goes through a war experience in his own backyard. It’s dirty, violent, painful, unexpected. And he’s in fear.”
Muren continues: “I was thinking we’d do something futuristic like the [1953] movie The War of the Worlds, something slick and contemporary. But Steven had more of a Private Ryan kind of storytelling in mind. Everything is dirty and dusty. Things look ‘silhouettey’ and sketchy. It’s not quite clear what you’re seeing. The camera moves look unplanned, like the amateur videos from 9/11.”
What is clear, though, is the devastation, which ILM created, along with the all-CG aliens and their ships, in record time. Given its compressed postproduction schedule of only 12 weeks, the crew, which began as a core group of 50 and grew to 179 during the last five weeks of post, could have chosen to work with its familiar, time-tested pipeline. It didn’t. Instead, the team ran the production through ILM’s new pipeline, a tool set called Zeno (see “From A to Zeno,” below).
The destruction in War of the Worlds was accomplished with miniature photography and CG elements composited into atmospheric live-action plates filled with mist, dust, and smoke.
“War of the Worlds was our test bed,” says Muren, who had spent 10 months after The Hulk helping devise interface standards for the new tool set, elements of which had been in development at the studio since 1998. His goal: Empower the artists, and make it possible for them to easily use a variety of tools so they have the opportunity to do practically anything they want to do. “We had TDs who did composites, and model makers who did [texture] painting,” says Muren. “If animators wanted to do rendering, they could. We instigated the philosophy that if people wanted to learn, we were open to it.”
In the dramatic sequence during which the tripods (the aliens’ ships) first appear, compositors worked with Zeno in addition to their usual compositing tools: Apple’s Shake, ILM’s Saber (based on Autodesk Media and Entertainment’s Inferno), and ILM’s CompTime. The sequence takes place at an intersection crowded with people and cars. The huge alien vehicles emerge from underground, twisting everything on the surface as they rotate into view, buckling the pavement, moving buildings, causing the front of a church to shear off.
“From a compositing standpoint, this film was the most complex I’ve ever seen,” says compositing supervisor Marshall Krasser, who has worked on more than 20 films at ILM. “Steven likes to set the mood with atmosphere, so the plates have dust, mist, smoke, and debris, and we have to match the plate.” To match these atmospheric elements, the compositors used images that were either shot for this film or found in their database of photographic elements.
The sequence was crafted with a mixture of effects, from particle simulations for the cracking pavement to practical elements, often with the help of Zenviro, the camera-projection module in Zeno. “You see a pickup truck that gets rotated around,” says Krasser. “We lifted it out of the plate, took it into the 3D realm, projected it onto 3D geometry in Zenviro, animated the geometry to rotate it, and composited it back into the plate. We didn’t have to go to a modeler or TD or renderer.” In the same way, the compositors removed other elements that would move during the earthquake-like sequence (telephone poles, stoplights, stop signs, and so forth), put them onto separate pieces of geometry, and animated them. “It’s surprising how much that extra effort helps sell a shot,” says Krasser.
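To give a flavor of how camera projection works in general (Zenviro’s actual implementation is proprietary), here is a minimal sketch: texture coordinates for a stand-in 3D object come from projecting its vertices through the camera that shot the plate, so the plate’s pixels travel with the geometry when it is animated. The camera convention (OpenCV-style, +z forward), intrinsics, and numbers are illustrative assumptions.

```python
# Minimal sketch of plate (camera) projection, the idea behind a
# Zenviro-style workflow. Illustrative only, not ILM's API.
import numpy as np

def project_to_plate_uv(vertices, K, cam_from_world, width, height):
    """Project world-space vertices into normalized plate UVs."""
    # Homogeneous world -> camera space.
    v_h = np.hstack([vertices, np.ones((len(vertices), 1))])
    cam = (cam_from_world @ v_h.T).T[:, :3]
    # Perspective projection through the intrinsics K, then divide by z.
    pix = (K @ cam.T).T
    pix = pix[:, :2] / pix[:, 2:3]
    # Normalize pixels to 0..1 UVs (flip v for the image's origin).
    uv = np.empty_like(pix)
    uv[:, 0] = pix[:, 0] / width
    uv[:, 1] = 1.0 - pix[:, 1] / height
    return uv

K = np.array([[1500.0, 0.0, 960.0],    # made-up focal length and
              [0.0, 1500.0, 540.0],    # principal point for a 1920x1080 plate
              [0.0, 0.0, 1.0]])
cam_from_world = np.eye(4)             # camera at origin, looking down +z

# A quad standing in for the pickup truck's proxy geometry, 10 units away.
quad = np.array([[-1.0, 0.0, 10.0],
                 [ 1.0, 0.0, 10.0],
                 [ 1.0, 2.0, 10.0],
                 [-1.0, 2.0, 10.0]])
uvs = project_to_plate_uv(quad, K, cam_from_world, 1920, 1080)
print(uvs)  # UVs are baked once; the quad can then be animated freely
```

Once the UVs are baked from the original camera, the geometry can rotate or buckle and still carry the photographed pixels with it, which is what lets a compositor animate a “real” truck.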
Using atmospheric effects, ILM compositors set the stage for the alien attack by combining a dark, foreboding sky with miniatures and live-action photography.
Similarly, to destroy buildings in the location shots, the compositors used live-action footage and photographs, from both the location and the miniature shoots, and projected them onto movable 3D geometry. “Sometimes we would have multiple versions of the same building with more damage as the shot went along,” says Krasser. “We’d add practical elements, like bricks falling off the buildings, to augment the damage.”
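The bookkeeping behind such progressive damage can be as simple as scheduling which projected image applies at which frame. The sketch below is purely illustrative; the file names and frame ranges are invented.

```python
# Hypothetical schedule of progressively damaged projection sources
# for one building. Frame numbers and file names are made up.
DAMAGE_VERSIONS = [            # (first frame it applies, projected image)
    (1,   "church_intact.tif"),
    (48,  "church_cracked.tif"),
    (96,  "church_collapsing.tif"),
]

def version_for_frame(frame, versions=DAMAGE_VERSIONS):
    """Pick the most-damaged version whose start frame has passed."""
    current = versions[0][1]
    for start, image in versions:
        if frame >= start:
            current = image
    return current

print(version_for_frame(60))   # -> church_cracked.tif
```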
The 3D elements were rendered and moved into ILM’s CompTime, something the group has done in the past, though not as easily and efficiently as now.
Zenviro made it possible to treat the photographic plates as if they were elements in 3D scenes; the huge tripods, by contrast, were fully 3D. With their spindly legs fully extended, the mysterious, creature-like machines towered 140 feet tall. “The heads were almost military-like, and had phosphorescent lights and vents that expelled gases,” says Randy Dutra, animation director. “It was like a tank perched atop delicate, long, graceful legs.”
Nineteen tentacles extended from the tripod’s head, including a snaky one-eyed probe, grabbers that drew people up into the head, and bloodsuckers. “One tentacle held a person down while another opened its three prongs and shot a long dagger down from the middle and into the body,” says Dutra. The animators handled the hero tentacles, while the background tentacles were simulated in groups. “It was a creepy Medusa-like effect,” Dutra says. “You couldn’t tell what they were doing until something happened. Steven set up the sequences so the audience is not given all the information at once.”
Spielberg also left unclear whether the tripod and its probes were mechanical or organic. “The tripods looked dirty and evil in a way that reflected the tone of the movie,” says Muren, “kind of organic in places and, sometimes, maybe parts of it aren’t. We walked a fine line. We didn’t want to confuse the audience, yet we wanted to keep things interesting.” The one-eyed probe tentacle, for example, slithered like an eel as it traveled through a cellar, but it looked mechanical.
Top left to bottom: The live-action plate was shot like an amateur video, then match-moved at ILM, where twisted miniature models and CG elements replaced the freeway and added debris. CG objects and compositing tricks created a dangerous, evil mood.
“It was always hovering or moving,” says Dutra. “Whether it seemed organic or mechanical depended on how we broke up the intervals of movement. Repeated, metered action was mechanical. Organic action would be more unexpected.”
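Dutra’s distinction between metered and irregular timing is easy to see in a toy example. The following sketch (not production animation code) generates beat times two ways: evenly spaced intervals read as mechanical, while jittered intervals with occasional holds read as organic.

```python
# Toy illustration of "intervals of movement": mechanical vs. organic beats.
import random

def mechanical_beats(start, count, interval):
    """Repeated, metered action: identical spacing between beats."""
    return [start + i * interval for i in range(count)]

def organic_beats(start, count, interval, jitter=0.4, hold_chance=0.2, seed=7):
    """Unexpected action: irregular spacing with occasional long holds."""
    rng = random.Random(seed)
    beats, t = [], start
    for _ in range(count):
        beats.append(round(t, 2))
        step = interval * (1.0 + rng.uniform(-jitter, jitter))
        if rng.random() < hold_chance:      # a sudden, unexpected pause
            step += interval * rng.uniform(1.0, 2.0)
        t += step
    return beats

print(mechanical_beats(0.0, 6, 12.0))  # frames 0, 12, 24, ... reads as a machine
print(organic_beats(0.0, 6, 12.0))     # uneven spacing reads as something alive
```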
Modelers built the tripods and the aliens in Maya using subdivision surfaces. The animators worked in Maya and then transferred the animation to Zeno. “Before, the animation was cached; now TDs can tweak the animation in Zeno,” says Curt Miyashiro, digital production supervisor. “Before, if we had wanted to change the headlights on the tripod, we would have had to go back to the animators or go into Maya. [With this film] we could change the direction of the headlights in Zeno while we were working on lighting the shot.”
To make the headlights undulate, the compositors, working in Zeno, applied 2D “tricks” directly to the animation. “For the headlights, I created a 2D fractal noise pattern and quarter-pinned it to the tripod head using animation data,” says Krasser. Because each corner of the image with the fractal noise pattern was animated and locked to the tripod, it moved with the tripod in 3D space and stayed properly positioned relative to the camera. For the windows, he tagged points along the tripod head’s geometry, used 2D fractal noise to generate mattes, and then cross-dissolved between color variations. The result filled the windows with moving colors, like fluid stained glass: organic, but alien.
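The “quarter-pin” Krasser describes is a standard corner-pin: a 2D image is warped by the homography that maps its four corners onto four tracked screen positions. A minimal sketch follows, with made-up corner positions standing in for real match-move data.

```python
# Corner-pin: solve the homography mapping a unit-square noise card onto
# four animated screen corners (one frame of hypothetical track data).
import numpy as np

def corner_pin_homography(src, dst):
    """Solve the 3x3 homography mapping 4 src corners onto 4 dst corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)   # fix h33 = 1

def warp_point(H, x, y):
    """Map a point on the card to its pinned screen position."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

src = [(0, 0), (1, 0), (1, 1), (0, 1)]                  # the noise card
dst = [(812, 301), (1104, 288), (1131, 502), (790, 517)]  # tracked corners
H = corner_pin_homography(src, dst)
print(warp_point(H, 0.5, 0.5))  # center of the noise card on screen
```

Because the destination corners are driven by the tripod’s animation data, the flat noise pattern appears to ride the 3D head, which is exactly the effect described above.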
Like the aliens’ vehicle (the tripod), this probe, one of its 19 tentacles, was designed, lit, and animated as a mysterious cross between something organic and mechanical.
“It’s a form of 3D compositing,” says Krasser. “Why write a shader when you can use other techniques to get something you haven’t seen before?”
Like the tripods, the aliens also had three legs (actually, two long arms and one leg). “They harm earthlings, but they’re also very curious,” says Dutra. “They’re survivalists studying this planet, trying to figure it out. Killing was a fact of their survival.” For animation reference, Dutra chose red-eyed tree frogs. That’s because he wanted the aliens to stay close to the ground, but plant their hands on the walls, rather than on the floor, to move.
To place texture maps onto the aliens’ subdivision surfaces, the crew developed sub-object texturing. “We partitioned the creature; otherwise, we would have had only one large map to paint,” says Miyashiro. Each creature had RGBA color maps, as well as maps for specular highlights, for skin thickness (which drives subsurface scattering), and so forth, all of which tell the shaders how to change as they receive light.
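As a rough illustration of the sub-object idea (the names and file paths here are invented, not ILM’s asset layout), partitioning amounts to mapping each region of the mesh to its own set of maps, so the shader can ask which maps apply to any given face:

```python
# Illustrative data model for sub-object texturing: one set of maps
# per partition instead of one enormous map for the whole creature.
from dataclasses import dataclass, field

@dataclass
class PartitionMaps:
    color: str        # RGBA color map
    specular: str     # specular map
    thickness: str    # skin thickness, drives subsurface scattering

@dataclass
class SubObjectTextures:
    partitions: dict = field(default_factory=dict)      # name -> PartitionMaps
    face_partition: dict = field(default_factory=dict)  # face id -> name

    def maps_for_face(self, face_id):
        """The shader asks: which maps apply to this face?"""
        return self.partitions[self.face_partition[face_id]]

alien = SubObjectTextures()
alien.partitions["head"] = PartitionMaps("head_col.tif", "head_spec.tif", "head_thk.tif")
alien.partitions["arm_l"] = PartitionMaps("armL_col.tif", "armL_spec.tif", "armL_thk.tif")
alien.face_partition.update({0: "head", 1: "head", 2: "arm_l"})
print(alien.maps_for_face(2).thickness)  # -> armL_thk.tif
```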
The creatures, too, were mysterious. “You see them, but you never get a clear look,” says Krasser. “So many times with visual effects you hit people over the head with ‘look at me’ stuff, while here you had to struggle to see the aliens.”
For the technical directors, the mysterious look was welcome. “It was great to have a creature move through pools of light,” says Michael DiComo, digital production supervisor. “When we have cues that we can use, like three bands of light falling across something in the plate, it makes our work look more real.” To place the lights, the TDs used Lux, the new lighting module in Zeno. According to DiComo, Lux is true 3D lighting for particles and creatures that brings match-moving, interactive lighting, and texture painting together. The shots were later rendered with Pixar’s RenderMan and Mental Images’ Mental Ray.
“The area we pushed most in this film was the aesthetic,” says Muren. “Especially in the CG world, the look is imposed by the technology. We put an effort into making the effects look organic and real, and that’s difficult to do. But when you have artists who understand when I say, ‘It can’t look too clean, too computer graphic,’ as well as a tool set that’s easy to use, they can do the shot.”
From top left: Compositors lifted elements from the film plate, projected them onto geometry, and moved the geometry. Photos of a damaged miniature let them shear the church front. Simulation software buckled the pavement. The final frame is at bottom.
From A to Zeno

Seven years in the making, Zeno, ILM’s new tool set, has redefined the studio’s pipeline, opened the production process to all the artists on the crew, and positioned the studio to create future forms of entertainment.
“Our first goal was to manage large scenes,” says Cliff Plumer, chief technology officer. “It was driven by the pod race in Star Wars: Episode I.” Thus, at its core, Zeno manages scene data. “The old pipeline used the old Softimage scene file,” he says. “Building our own gave us control.”
And with that control came new tools, easier integration of commercial tools, a consistent user interface, and a structure that handles multiple artists working on multiple versions of elements within a scene. It’s a 64-bit system with multiprocessor support and the ability to talk directly to Nvidia cards. “It’s a fundamental tool set, a timeline, a scene graph, a curve editor,” says Plumer. “It’s all about workflow. Zeno loads the tools you need. Instead of calling a modeler, an artist has access to modeling tools, and a modeler can pull up lights or add a simulation and see the impact on the model.”
The first application on Zeno handled camera tracking; that module won an Academy Scientific and Technical Award. A core engine built into Zeno now handles all simulation tasks: hair, skin, cloth, and fluids. Other modules include those for sculpting, facial animation, particles, rotoscoping, match-moving, and painting, as well as the studio’s new camera-mapping software (Zenviro) and new lighting tool (Lux). Support for Python, a scripting language, lets people write their own modules.
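Zeno’s actual Python API is proprietary, but the general pattern such scriptable systems expose is a plugin registry: artists write a module, register it by name, and the host loads it on demand. A hypothetical sketch:

```python
# Generic plugin-registry pattern; all names here are hypothetical,
# not Zeno's real API.
MODULES = {}

def register_module(name):
    """Decorator: make a tool available in the shared tool set by name."""
    def wrap(cls):
        MODULES[name] = cls
        return cls
    return wrap

@register_module("jiggle_sim")
class JiggleSim:
    """A hypothetical artist-written simulation module."""
    def run(self, scene):
        print(f"simulating jiggle on {scene}")

# The host loads whatever tools the artist asks for:
MODULES["jiggle_sim"]().run("tripod_head_v3")
```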
“Before, we had dozens of programs and would have to convert data between them,” Plumer says. “Now we have a common interface.” Built into Zeno are live links to Alias’s Maya and Adobe’s Photoshop. In fact, to ease the artists’ learning curve, Zeno’s interface mimics that of Maya, especially the outliner, making it easier for the company to hire people with experience.
Zeno also links to such compositing programs as Apple’s Shake and Autodesk Media and Entertainment’s Inferno. Output goes to Mental Images’ Mental Ray and Pixar’s RenderMan for rendering, and there’s a pipe back from the renderer to the compositing module.
Under the hood, a sophisticated system makes it possible for any artist to use any tool by splitting a scene into files yet keeping everything together. “We needed to have a shot look unified, but on the other hand, we needed to be able to throw 10 people at it and get it done,” says researcher Alan Trombla, himself a former technical director and one of the “brains” behind Zenviro. “Splitting is conceptually simple (you can separate paint from geometry), but there are relationships between this paint and this geometry,” he notes.
The solution? “We organized things by the people responsible for them,” says Trombla. “And then we implemented nondestructive override. If a technical director makes a change, that change goes into the TD’s file. If an animator makes a change, the change goes in the animator’s file.” Trombla provides a simple example from War of the Worlds: A technical director needed to add a bullet hole to the tripod’s texture map. “Before, it would have gone back to a painter,” he notes. “Instead, the TD painted it, and the bullet hole was layered on top. Later, the painters added a reddish tint to the model. Even though the color changed beneath, when the TD looked at the tripod, the bullet hole was still there.” The same layering principle applies throughout the pipeline.
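The nondestructive-override model Trombla describes can be approximated with a stack of per-artist layers, where the strongest layer holding an opinion about an attribute wins. A simplified sketch (illustrative data model only, not ILM’s file format):

```python
# Per-artist layers with nondestructive override: the TD's bullet hole
# survives even when painters change the color underneath it.
class Layer:
    def __init__(self, owner):
        self.owner = owner
        self.opinions = {}           # attribute path -> value

    def set(self, attr, value):
        self.opinions[attr] = value

def resolve(attr, layers):
    """Strongest (last-listed) layer with an opinion wins."""
    for layer in reversed(layers):
        if attr in layer.opinions:
            return layer.opinions[attr], layer.owner
    raise KeyError(attr)

paint = Layer("painter"); td = Layer("td")
paint.set("tripod/head/color", "weathered gray")
td.set("tripod/head/decals", "bullet_hole_01")   # the TD's local addition
paint.set("tripod/head/color", "reddish tint")   # painters change the base
stack = [paint, td]                              # td overrides paint
print(resolve("tripod/head/color", stack))   # ('reddish tint', 'painter')
print(resolve("tripod/head/decals", stack))  # ('bullet_hole_01', 'td')
```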
Most recently, Trombla worked on the lighting module, Lux. By managing all the passes and dependencies, Lux makes it possible for TDs to tweak individual pieces in a scene without worrying whether changing the dirt map, for example, will affect the specular map, which affects the beauty pass. “Artists shouldn’t have to be bookkeepers,” says visual effects supervisor Dennis Muren, who set the interface standards.
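The bookkeeping Lux automates resembles a dependency graph over render passes: edit one input, and only the passes downstream of it need to re-render. A sketch with hypothetical pass names:

```python
# Standard dependency-graph walk over render passes; pass names are
# invented for illustration.
DEPS = {                      # pass -> the passes it depends on
    "dirt_map": [],
    "spec_map": ["dirt_map"],
    "diffuse": ["dirt_map"],
    "beauty": ["diffuse", "spec_map"],
}

def downstream(changed, deps):
    """Every pass that must be re-rendered after `changed` is edited."""
    dirty, stack = set(), [changed]
    while stack:
        node = stack.pop()
        for name, inputs in deps.items():
            if node in inputs and name not in dirty:
                dirty.add(name)
                stack.append(name)
    return dirty

print(sorted(downstream("dirt_map", DEPS)))  # ['beauty', 'diffuse', 'spec_map']
```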
Now, Zeno is stretching beyond ILM. When the company moves to Lucas’s new Letterman Digital Arts Center in San Francisco’s Presidio this month, LucasArts and ILM will be on the same pipeline. “We’ve been working on collaborative tools with LucasArts for 18 months,” says Plumer. “Zed, which is LucasArts’ game engine, will have a live connection to Zeno. We can edit assets in Zeno, drop them into Zed, edit them in Zed, and update them in Zeno.”
And that means George Lucas will have in place a 21st-century pipeline tuned for both game development and visual effects.
“The goal,” Plumer says, “is real time.” Think about the possibilities....
- BR
Barbara Robertson is an award-winning journalist and a contributing editor for Computer Graphics World.