By Barbara Robertson
It's too early to tell whether moviegoers will think the latest 3D animated film is as cool as the other films in this new genre of feature animations, but Ice Age could be the quirkiest. The prehistoric animated stars of this film are a moody woolly mammoth, a smooth-talking sloth, a sinister saber-toothed tiger, and an acorn-obsessed saber-toothed squirrel. Released in mid-March, the 20th Century Fox presentation of the Blue Sky Studios film features Ray Romano as the voice of Manny the mammoth, John Leguizamo as Sid the Sloth, and Denis Leary as Diego the Saber-toothed Tiger. (Scrat, the squirrel, doesn't talk.)
Directed by Blue Sky's Chris Wedge, an Oscar winner for the studio's short animation "Bunny," the film tells the story of disparate and antisocial characters who, for reasons of their own, form a bond as they attempt to return a human child to his parents. Along the way, there's slapstick, physical comedy reminiscent of a Charlie Chaplin film or a Chuck Jones animation. But there are also darker action scenes involving the tigers that caused the MPAA to give the film a PG rating for "mild peril."
A new voxel-based system helped speed the raytracing of the hair strands used to create Ice Age stars Diego, Sid, and Manny, according to Maurice Van Swaaij, manager of software development.
"Fox handed us the first draft of a script that was an action-adventure dramatic story and said, 'make it into a comedy,' which was no mean feat," says Wedge. "We put our heads together and it's turned out to be funny in some places and dramatic in others."
Manny, Diego, and Sid meet these survivalist dodo birds in a militant outpost during their journey. Like the dodos, 15 of the 30-some prehistoric characters created for the film had acting roles.
When the studio began visual and story development on Ice Age in the spring of 1999, one of the early decisions was to make the environments simple and the characters a little odd. "I have quirkier tastes than I think the studio did and I managed to push [the design of] a couple of the characters, like Sid," Wedge says. "He looks strange and not like a sloth, but I think he's appealing and loveable." Scrat was a pure Blue Sky invention. "He has nothing to do with the story," Wedge says. "He just threads in and out as comic relief."
"We all have the Disney feel for animation-everybody in the industry does," says Carlos Saldanha, co-director, "but we wanted to add a little edge to it." Even so, to get the timing for the animals, the team of 32 animators did field research. "Scrat has a Looney Tunes feel, but we went to the park and watched squirrels to see how their tails move, how twitchy they are," Saldanha adds.
In addition to the leads, the animators worked with 15 secondary characters, including the baby and other humans, dodo birds, and rhinos as well as several animals that were created specifically for a big migration scene. All told, 30-some different types of prehistoric characters were modeled, rigged, and animated in Alias|Wavefront's Maya, where, for facial animation, the team primarily used blend shapes and sometimes a special jaw rig. "We rigged jaws with multiple joints that could be used to change how the lips move with the jaw," says Mark Piretti, lead technical animator. "The strength of a smile is about the relationship of the lip corner to the eye. The lip height controls proved helpful when there were extreme smiles and sneers, to control how far the lip corners moved with the jaw."
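The blend-shape approach the team describes can be sketched in a few lines. This is an illustrative toy, not Blue Sky's rig or Maya's implementation: it assumes each target shape stores per-vertex offsets from a neutral pose, and the rig sums the offsets scaled by animator-driven weights.

```python
def apply_blend_shapes(neutral, targets, weights):
    """Combine blend-shape targets with a neutral face mesh.

    neutral: list of (x, y, z) vertex positions.
    targets: dict mapping shape name -> list of per-vertex offsets.
    weights: dict mapping shape name -> blend weight (0.0 to 1.0).
    """
    result = [list(v) for v in neutral]
    for name, w in weights.items():
        for i, (dx, dy, dz) in enumerate(targets[name]):
            # Each active shape contributes a weighted offset per vertex.
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]
```

A half-weight "smile" target, for instance, moves each vertex halfway toward its fully smiling position; a jaw rig like the one Piretti describes would then adjust how far the lip corners follow the jaw on top of this.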
Also modeled and sometimes animated in Maya were a variety of environments that ranged from forest to tundra to glacier to tropical island. "It's basically a travel movie, so with rare exceptions, each sequence had its own environment," says Michael Defeo, lead supervisor. For the film's 32 sequences, the modelers worked from 3D layouts, building only what was needed for the shots in the sequence and keeping the geometry as simple as possible. "We couldn't model every crack, fissure, and bump on rock faces," Defeo says. For that, the crew relied on the renderer.
The human baby carried here by Sid, a prehistoric sloth, has gotten separated from his parents, so Sid and two unlikely companions-Diego, a saber-toothed tiger, and Manny, a woolly mammoth-join forces to find the child's family in this all-CG animation.
For rendering, Blue Sky used its proprietary raytracing software, CGI Studio, which has been under development for 15 years, since the studio was founded in 1987. "We've always done everything with raytracing," says Carl Ludwig, vice president of research and development. Global illumination models were added to the software for "Bunny" [see pg. 23, February 1999]. For this film, Ludwig says the R&D crew made CGI Studio more efficient, wrote queuing software to manage the rendering process, and added implicit surface capabilities to help create effects.
"We start out with what is physically correct and then we work on ways to make it efficient," says Ludwig. "The visual aspect is more important than the physical aspect, but the physical aspect is a great starting point." On his desk, Ludwig has a collection of spheres made of different materials. "I'm always looking at these things and then looking at what I render, and I keep working at the lighting code until I'm satisfied. When you're comparing to the real world instead of an abstract world, it's hard. There's a lot of subtlety."
As a sign of how quickly the genre of 3D animated features is evolving, when Blue Sky Studios began working on Ice Age, the film would have been the first full-length animation to be rendered with a raytracer. Instead, that distinction went to Jimmy Neutron: Boy Genius, rendered with LightWave, which preceded Ice Age by a scant three months.
"It's amazing," says Michael Reed, research associate at Blue Sky. "For a while people were talking about how raytracing was a thing of the past; that it was not very useful. That's certainly the case for scenes with few objects. But now, scenes are getting more complex and can have hundreds of thousands of objects. When you're talking about those numbers, the speed advantage with scan-line renderers disappears." He also notes that with scan-line renderers, hundreds of lights may be needed to light a scene, but with CGI Studio, they can use only the light sources in the modeled environment.
Although often conveniently described as a raytracer, CGI Studio is a language as well. And the scientists at Blue Sky think of the model CGI Studio uses to replicate the physical world not as traditional graphics rendering, according to Reed, but rather as a light transport simulation. He likens the study of light transport simulations to that of studying energy propagation or heat dissipation. "That's the physics side," he says, "figuring out how light moves in a scene." Reed explains that to get a complete model of light you can pair radiosity, which models diffuse light transport, with raytracing, which models specular light transport, but the combination is cumbersome. "If you combine them in an obvious way you end up with a system impossible to use because it's too slow," he says. "The difficult part is in figuring out the light distribution functions. That's where these light transport methods come into play."
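The diffuse/specular split Reed describes can be written in the standard rendering-equation form. This is the textbook formulation, not necessarily how CGI Studio expresses it: outgoing light at a point is emission plus an integral over incoming directions, with the surface response split into a diffuse term and a specular term,

```latex
L_o(x,\omega_o) = L_e(x,\omega_o)
  + \int_{\Omega} \big[\, f_d(x) + f_s(x,\omega_i,\omega_o) \,\big]\,
    L_i(x,\omega_i)\,(\mathbf{n}\cdot\omega_i)\, d\omega_i
```

Radiosity solves for the direction-independent diffuse term $f_d$, while classic raytracing follows the specular term $f_s$; combining the two naively means evaluating both transport solutions at every visible point, which is the slowness Reed alludes to.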
The footprints were created without displacing surface geometry. Instead, Blue Sky's raytracing software, CGI Studio, calculated where a camera or shadow ray would intersect an implicit surface.
For his part, Reed worked on methods for creating implicit surfaces in CGI Studio. "With implicit surfaces, you can generate an unbelievable amount of detail without a memory penalty," he says, "but I don't think they've been used much in production because they've been slow and difficult to control." The effect is similar to that of a displacement shader, in which surface geometry is offset, typically based on shades of gray in a painted texture map. Implicit surfaces in CGI Studio are used for the same reason as displacement shaders: to create detailed surfaces without having to model them. And they can be defined in the same way, with gray-scale maps or procedurally. The difference lies in how the surface is generated. "Instead of chopping the surface into tiny pieces and moving the pieces where they would be offset by the map [as with a displacement shader], you figure out where a ray would intersect the surface, but you don't create the surface explicitly," explains Reed. "Because we don't create geometry, the memory utilization is almost zero."
The reason the technique isn't used widely in production, he explains, is because typically to compute intersections and create implicit surfaces, several calculations are done as the ray steps along looking for the intersection point. "It's like a distance measurement," Reed says. "You take little steps in space and compute the result of a math function at each step. When the result is zero, you're on the surface. But to render a surface with high detail, the steps have to be small." Analyzing special cases, such as when maps are used to define the implicit surface, helped make the technique efficient, as did more typical acceleration methods such as bounding boxes and adapting step sizes to the distance from the camera.
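The stepping loop Reed describes can be sketched as a toy ray marcher against a heightfield defined by a grayscale map. This is an illustration of the idea only (uniform step size, nearest-neighbor lookup, invented function names), not Blue Sky's production code, which added the acceleration tricks described above.

```python
def height_from_map(gray_map, x, z, amplitude=1.0):
    """Treat a grayscale map (2D list of 0..1 values) as a heightfield.
    Nearest-neighbor lookup keeps the sketch simple."""
    rows, cols = len(gray_map), len(gray_map[0])
    i = min(rows - 1, max(0, int(z)))
    j = min(cols - 1, max(0, int(x)))
    return amplitude * gray_map[i][j]

def ray_march_heightfield(origin, direction, gray_map,
                          max_dist=100.0, step=0.05):
    """March a ray until it crosses the implicit surface
    y - height(x, z) = 0. No triangles are ever generated, so memory
    use stays near zero no matter how detailed the map is."""
    t = 0.0
    while t < max_dist:
        x = origin[0] + t * direction[0]
        y = origin[1] + t * direction[1]
        z = origin[2] + t * direction[2]
        if y <= height_from_map(gray_map, x, z):
            return t  # hit: the ray is at or below the surface
        t += step
    return None  # miss
```

As Reed notes, small steps give fine detail but cost many function evaluations per ray; a production version would adapt `step` to the distance from the camera and cull rays with bounding boxes before marching at all.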
One application of the implicit surface technique was for a scene during which the characters ski through a giant snow-covered field. "We see a close-up of the snow being tossed up and the ground deforming," Reed says. "As the camera pulls back, we need to keep the same amount of detail. People often handle this with separate levels of detail, but with the implicit surface technique, we can use more rays for the areas close to the camera and fewer for the areas farther away."
Another application was to create tracks in the snow. Senior technical director Robert Cavaleri explains: "You have a character that you know will be walking on snow, so rather than animate him walking on the surface, you put his feet down two or three inches below the ground." The team created software that examined the animation and created a grayscale map: black and white distinguished where the surface was and wasn't depressed, while intermediate shades defined the depth of each footstep. CGI Studio then used this map to create the implicit surface as the scene was rendered. "We also used this technique to create a disrupted surface during a sequence in which the saber-toothed tigers are fighting in the snow," Cavaleri says. "As the tigers roll around, the implicit surface technique creates indentations in the snow." And then to complete the effect, the team placed particle systems [in Maya] that automatically kicked up bursts of snow when a foot intersected the ground.
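A tool like the one Cavaleri describes might rasterize foot contacts into the grayscale map something like this. The function and parameter names are hypothetical; the sketch simply stamps a shaded disc per contact, with the shade encoding how deep the foot was animated below the surface.

```python
def bake_footprint_map(foot_samples, width, height, radius=2, max_depth=3.0):
    """Rasterize animated foot contacts into a grayscale depression map.

    foot_samples: list of (x, y, depth) pixel positions, where depth is
    how far the foot was placed below the snow surface. Output pixels
    run from 0.0 (undisturbed) to 1.0 (deepest footprint).
    """
    gmap = [[0.0] * width for _ in range(height)]
    for fx, fy, depth in foot_samples:
        shade = min(1.0, depth / max_depth)
        # Stamp a disc of the footprint's shade around the contact point.
        for y in range(max(0, fy - radius), min(height, fy + radius + 1)):
            for x in range(max(0, fx - radius), min(width, fx + radius + 1)):
                if (x - fx) ** 2 + (y - fy) ** 2 <= radius ** 2:
                    gmap[y][x] = max(gmap[y][x], shade)
    return gmap
```

The renderer would then read this map at shading time, exactly as with any other grayscale-driven implicit surface, so the snow is depressed only where the ray lookup says a foot has been.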
For the falling snow, using a technique developed by technical director Keith Klohn, the team painted snowflake maps in Adobe Systems' Photoshop, applied the maps to particles in Maya, confined the particle simulation to a box that could be positioned in 3D space, ran the simulation long enough to generate sufficient frames to handle the length of a sequence, and then placed the "snow" box into a scene. "The box could be scaled and rotated into the position the art director wanted," says Cavaleri.
The effects team also placed and generated as many as 50,000 animated trees on the side of a mountain, created waterfalls, rivers, and mud pits, and produced avalanches, geysers, and lava flows.
The lead characters travel through nearly 30 different landscapes while looking for a human baby's family. The effects team created snowfall, placed trees, and added such elements as dust using proprietary tools and Maya particles.
"In the beginning of production, we held back on some of the effects work," says Ludwig. "We budgeted 15 hours per frame for rendering, but we increased our efficiency tremendously. Halfway through, we'd gotten it to 7.5 hours per frame. So, we opened the spigot a little and added more." Blue Sky's render farm consists of 512 DS10/DS10L Compaq Alpha computers; however, the queuing system, developed under production programmer Dan Weeks' direction, also took advantage of SGI desktop machines (mostly Octanes), a 16-processor SGI Origin, a 4-processor SGI Onyx, and other single- and multi-processor SGI machines. The system, called RUSH, managed batch renders, particle renders, and simulations, could render frames with multiple resolutions within a frame, and helped with interactive lighting by using multiple machines simultaneously via a new program called QuickRender.
In the past, TDs used a text editor to place lights. Even though few lights are used in the scenes-usually four to eight and rarely more than ten-the new program still helped them save time. "If you can save a TD 30 seconds or a minute 50 times a day, it adds up, and when you consider the whole lighting department, it helps the overall efficiency," says Joe Higham, manager of software tools. "Because QuickRender can utilize multiple processors (accessed through the queuing system), TDs can now get immediate renders to help them visualize the placement of lights to achieve effects."
"The hallmark of our effects work was in the big bridge sequence," says Cavaleri. In this sequence, the lead characters are walking across a frozen landscape when suddenly the ground gives way and several geysers burst through. The characters hang onto pillars of ice as lava flows through a canyon beneath them.
The geysers were created with Maya particles by Tim Speltz, technical director, but the lava was created with a custom software program called Swirl2D written by senior research associate John Turner, now at Los Alamos. "The software examined the topology of a surface, and then allowed a map to be distorted using physical fluid flow calculations," explains Cavaleri. The temperatures of the hot surface and icy walls governed the shape and the speed of the flow. To soften the harsh edges, the effects team used particles to emit steam where lava met ice-as pieces of ice fell into the lava and along the walls-and a volume shader to form a dense haze slightly above the lava surface.
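One common way to "distort a map using fluid flow," as Cavaleri puts it, is semi-Lagrangian advection: each output pixel looks backward along the velocity field and pulls its color from where that fluid came from. Swirl2D itself is proprietary and its method is not described, so this Python sketch is only a plausible illustration of the general technique, with invented names.

```python
def advect_map(texture, velocity, dt=1.0):
    """Distort a 2D texture by one step of a velocity field.

    texture:  2D list of scalar values (e.g. a lava color channel).
    velocity: 2D list of (vx, vy) pairs, one per pixel.
    """
    h, w = len(texture), len(texture[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vx, vy = velocity[y][x]
            # Trace backward along the flow; clamp to the map edges
            # and sample nearest-neighbor for simplicity.
            sx = min(w - 1, max(0, int(round(x - vx * dt))))
            sy = min(h - 1, max(0, int(round(y - vy * dt))))
            out[y][x] = texture[sy][sx]
    return out
```

Repeating this step frame after frame makes painted detail stream along the flow field; in the film's setup, the velocities themselves would be shaped by the hot floor and the cold canyon walls.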
To develop the look of the lava, technical director Eric Maurer picked a new Blue Sky program called convertShader to put colors and textures on the surface. With convertShader, the team could interactively create and see materials that would be rendered in CGI Studio by using Maya as a front end. "It opened up a bottleneck we've had for a while," says Trevor Thomson, a senior programmer. "Before, you had to be a programmer to create procedural textures." Thus, convertShader not only made life easier for the technically oriented effects team, it opened up CGI Studio to artists.
Not all effects, however, were created by the effects team. Some were added in compositing. Although the bulk of the compositing was handled by TDs using Nothing Real's Shake, for some scenes, digital paint artist John Siczewicz preferred Discreet's Inferno. For example, in one sequence, when Sid is trying to start a fire by rubbing sticks together, he added rain and hail using Inferno's particle systems. For another, he zapped Scrat with lightning and turned the squirrel into a glowing skeleton. "A lot of the time, though, my area would clean up elements and put them back into the show using Shake or Inferno," he says. "We've gone over the film with a fine-tooth comb."
Now, people at Blue Sky are putting what they've learned from Ice Age toward what they hope will be their second film, and devising even more efficient methods of creating and rendering the graphics. "In this movie, we started to add richness as we got deeper into it," Ludwig says. "We'll carry it further in the next project." The next project, which has already been under development for two years, is based on Blue Sky's own script. Its viability hinges on the success of Ice Age-a necessity Wedge hopes lessens with time.
"I'd love to get to a point where we can make these movies more like independent filmmakers do, with a smallish amount of money," says Wedge, "and see if we can make something that is more...more..."
"Like 'Bunny?' " we ask.
"Like 'Bunny,' " he answers. "Exactly."
Barbara Robertson is Senior Editor, West Coast for Computer Graphics World.