Open Before Christmas
Volume 30, Issue 12 (Dec. 2007)

The winter holidays bring nearly as much festive cheer to movie-theater box offices as do the summer blockbusters, and this year is no exception. In addition to the two big visual effects films—Beowulf and The Golden Compass—a trio of smaller movies is also vying for a share of the season’s ticket sales.

I Am Legend, Fred Claus, and Enchanted, three films produced in three different styles, have one thing in common: photorealistic CG animals. In Enchanted, a little CG chipmunk nearly steals the show; in Fred Claus, CG reindeer pull a sleigh for Santa’s big brother; and in I Am Legend, CG deer and lions roam the streets of a post-apocalyptic New York City.

From stylized digital humans, to polar bears that talk, to chipmunks that pantomime, the CG elements in this season’s films carry story points, create emotions, and, most of all, entertain us. As movie critic Roger Ebert puts it: “First we get animation based on reality (Beowulf), and now reality based on animation (Enchanted).” And without computer graphics, none of it would have been possible. Enjoy!

TOONING REALITY
Tippett Studio populates a photorealistic fairy tale with furry CG animals that sometimes behave like cartoons.

Disney’s Enchanted begins as a classic 2D animated fairy tale, but the handsome prince’s wicked stepmother pushes his beautiful bride-to-be into a well just before the wedding, and that sets the story in motion. When Giselle emerges, she’s actress Amy Adams and she’s in Times Square. Giselle doesn’t know she’s no longer a cartoon, so when she leans out an apartment window and sings her “happy working song,” it makes sense for rats, pigeons, and cockroaches to show up and merrily whistle while they work.

Kevin Lima, who directed Disney’s animated feature Tarzan and live-action feature 102 Dalmatians, masterminded Enchanted’s blend of animation and live action. Tippett Studio created the photorealistic CG animals and insects for the “happy working song” and other scenes, the wicked stepmother (Susan Sarandon) who turns into a dragon for the film’s finale, and Pip, a disarmingly charming chipmunk that nearly steals the show.

In addition, CIS Hollywood contributed 36 of the 320 visual effects shots in the film, primarily wire removals and composites; Weta Digital created the first and last shots with the Disney castle and storybook; and Reel FX worked on two of the storybook shots.

“Pip was probably the biggest challenge for the entire project,” says Thomas Schelesny, visual effects supervisor. “He was unique. He’s the only guy in the movie who knows what’s going on.”

However, knowing what’s going on and explaining that to Giselle [Amy Adams], the almost-princess, and her handsome prince [James Marsden], are two different things, as Pip discovers. Once he lands in the real world, the cartoon chipmunk loses his commanding speaking voice and has to resort to squeaks and pantomime.


Tippett Studio’s CG chipmunk, Pip, dances a fine line between reality and cartoon animation. He moves like a chipmunk, communicates with human characteristics, and performs a cartoony pantomime.

“The kind of work Pip needs to do is not standard visual effects work,” Schelesny says. “We really needed to study how a traditional Disney animator would approach the character.”

A crew of approximately 12 animators, led by Tom Gibbons, worked on the show at Tippett Studio, and every animator performed Pip. “We got a flavor for Pip starting from drawings by James Baxter, who did the 2D animation,” Gibbons says. “We took those tidbits and information from Kevin [Lima] and Tom [Schelesny] and had a huge Disney history lesson. We watched all the films. We knew that Pip would be an homage to the early Disney films.”

When Pip is on the move, the animators stayed true to his animal nature. When he needs to communicate, they animated him with human characteristics. And, in one breakout scene, he becomes a furry cartoon character. In that scene, he uses body language to warn Prince Edward about the bad guys. “He changes his form to mimic the characters he charades,” Gibbons says. “We incorporated everything we know into his rig.”

As animators working in Autodesk’s Maya squashed and stretched Pip into various shapes, an internal muscle system maintained the integrity of his volume. For facial animation, the animators used blendshapes and widgets that ride outside the rig to move dozens of controllers that pushed around clusters of the mesh.
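The article doesn’t spell out how those widgets hook in, but the general Maya pattern is easy to picture: a blendshape node mixes sculpted targets into the head, and a viewport controller sitting outside the rig drives its weights. A minimal Python sketch follows, with all node and attribute names hypothetical rather than Tippett’s actual rig:

```python
# Minimal sketch of the blendshape-plus-widget pattern in Maya, assuming
# a base head mesh and sculpted target shapes already exist in the scene.
# All node and attribute names here are hypothetical, not Tippett's rig.
import maya.cmds as cmds

# Blendshape node mixing sculpted targets into the base head mesh.
blend = cmds.blendShape('pip_smile_tgt', 'pip_frown_tgt', 'pip_head_base',
                        name='pipFaceShapes')[0]

# A widget that rides outside the rig: a locator carrying a custom
# attribute the animator can select and key directly in the viewport.
widget = cmds.spaceLocator(name='pip_face_widget')[0]
cmds.addAttr(widget, longName='smile', attributeType='double',
             minValue=0.0, maxValue=1.0, keyable=True)

# Maya aliases each blendshape weight by its target's name, so connecting
# the widget attribute lets the controller push that part of the mesh.
cmds.connectAttr(widget + '.smile', blend + '.pip_smile_tgt')
```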

To view the animal’s silhouette without waiting for final renders, the animation team could output frames with low-resolution fur to check poses. Tippett’s proprietary Furocious software produced the final fur renders that moved on to the compositors. But, the compositors also had an earlier role in the process.



For the “Happy Working Song,” Tippett created CG pigeons and cockroaches (at top), and mixed them into scenes with real and CG rats, pigeons, and Amy Adams (above).

On set, Schelesny used a variety of techniques to give the actors Pip’s eye line: laser pointers, stuffies, sticks, and wires. Later, matchmovers provided a virtual camera for the animators’ 3D world that matched the live-action camera. The places the camera pointed and the directions the actors looked dictated where the animators needed to position Pip. But when the camera moves in the live-action plate limited the animators too much, the compositors retimed the plates using Twixtor (from RE:Vision Effects) within Apple’s Shake to add frames. Sometimes they even changed the camera moves. “If Pip looked up, we tilted the camera up,” says Chris Morley, compositing supervisor.

The compositors also added the last touches to Pip’s fur to integrate him into the plate. “The TDs did a great job lighting him, and then we took the renders and created two mattes—one based on z depth, one based on the depth away from the camera in y—from the ground up,” says Morley.

Using the y mattes, they added gradients that, for example, darkened Pip from the ground up to integrate him into shadows, and with the z mattes, they would sometimes defocus Pip the farther he was from camera. “We’ve done a lot of furry animals, so it’s second nature for us to composite them in a realistic fashion,” Morley says. But never in a film quite like this one.
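Tippett’s compositors worked in Shake, but the two-matte idea is tool-agnostic: a height gradient seats the character in ground shadow, and a depth matte drives how much defocus to mix in. A rough numpy sketch, assuming float image arrays with matching depth passes (all names and values are illustrative, not Tippett’s pipeline):

```python
# Rough numpy sketch of the two-matte integration idea: a y (height)
# gradient darkens the character from the ground up, and a z-depth matte
# mixes in defocus with distance. Names and values are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def integrate_character(rgb, y_height, z_depth, shadow_gain=0.4, focal_z=2.0):
    # y matte: 1.0 at the ground plane, falling off to 0.0 higher up.
    y_matte = np.clip(1.0 - y_height / y_height.max(), 0.0, 1.0)
    # Darken from the ground up to seat the character in contact shadow.
    shaded = rgb * (1.0 - shadow_gain * y_matte[..., None])

    # z matte: 0.0 at the focal distance, rising as the character recedes.
    z_matte = np.clip(np.abs(z_depth - focal_z) / focal_z, 0.0, 1.0)
    # Blend toward a blurred copy the farther he is from camera.
    blurred = gaussian_filter(shaded, sigma=(3, 3, 0))
    return shaded * (1.0 - z_matte[..., None]) + blurred * z_matte[..., None]
```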

“I’ve worked on a bunch of films during my 12 years at Tippett and three years at studios in Canada before that,” Schelesny says. “This film was special.”

—Barbara Robertson


FLYING REINDEER
For Fred Claus, Cinesite sends Santa’s brother to the North Pole.

You might be able to buy the premise of the film Fred Claus, that Santa has a scallywag of an older brother named Fred (Vince Vaughn), but no Christmas comedy would be complete without reindeer, Santa’s sleigh, and elves.

A crew of approximately 35 people at Cinesite spent about a year helping Santa’s reindeer pull a sleigh from Chicago to the North Pole and applying visual effects wizardry to one of his elves. Originally, seven real reindeer on the set were going to pull the sleigh, but it was too heavy. “It was the size of an SUV,” says visual effects supervisor Simon Stanley-Clamp. “And the base was made from cast iron so it was strong enough to be thrown around on a gimbal. It took 10 of us to move it around the set.”


CG reindeer created at Cinesite pull Santa’s heavy sleigh in the feature Fred Claus.

Those real reindeer, however, provided great reference for modelers working in Autodesk’s Maya and Mudbox to build their digital clones. Groomers clumped the reindeer’s matted fur in the front and folded the hair around their leather harness straps by using painted weight maps.

For rigging and animation reference, the crew traveled to a reindeer farm to see the animals on the hoof. But, to fly digital reindeer past Chicago rooftops, they watched videotapes of swimming horses. “The horses pull themselves through the water by stretching out their legs,” Stanley-Clamp says. As a result, riggers devised a system that allowed the digital reindeer to overextend their front and rear legs.

“Initially, we blocked out their flight path using simple splines to get a sign-off on direction and speed,” Stanley-Clamp says. “Then we roughed-up the animation so it wouldn’t look like they were on a roller coaster. As well as moving the sleigh, they needed to have their own movement.” Riggers arranged the gear that hitched the reindeer to each other and to the sleigh so that the animals could bank and turn while staying in formation, and procedural animation moved the hitching gear appropriately.
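That two-stage approach (block a clean spline for sign-off, then layer extra motion over it) is simple to sketch; the waypoints and noise amplitudes below are invented for illustration:

```python
# Sketch of spline blocking plus "roughing up": fit a smooth curve through
# approved waypoints, then add low-frequency wobble so the flight doesn't
# read as a roller-coaster rail. Waypoints and amplitudes are invented.
import numpy as np
from scipy.interpolate import splprep, splev

# Blocked-out waypoints (x, y, z) signed off for direction and speed.
waypoints = np.array([[0.0, 40.0, 90.0, 150.0],    # x
                      [30.0, 45.0, 80.0, 120.0],   # y (altitude)
                      [0.0, 10.0, 5.0, -20.0]])    # z

# Fit a smooth parametric cubic spline through the waypoints.
tck, _ = splprep(waypoints, s=0)

def sleigh_position(t):
    """Position along the path at parameter t in [0, 1], with wobble."""
    x, y, z = splev(t, tck)
    # Layered sine "rough-up" gives the team its own movement on top of
    # the sleigh's path instead of a perfectly clean ride.
    bob = 0.8 * np.sin(t * 12.0) + 0.3 * np.sin(t * 31.0)
    return np.array([x, y + bob, z])

# Sample 240 frames (10 seconds at 24 fps) along the flight path.
frames = [sleigh_position(t) for t in np.linspace(0.0, 1.0, 240)]
```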

Once the CG reindeer and sleigh take off from a rooftop in Chicago, they fly past buildings and then zoom into the clouds and over hundreds of miles of terrain. “The sleigh is moving 999 miles per hour,” says Stanley-Clamp. “David [Dobkin, the director] wanted the cities to whiz by.” Because the sleigh covers so much ground, the crew built the terrain as digital environments in Side Effects’ Houdini using 3D models and projection maps. “One reason we used Houdini was to procedurally link repeated environments as we travel through layers and layers of clouds. Sometimes you see cities through the clouds, sometimes gaps with no twinkling lights. The gaps created the illusion of speed.” For a sequence at the end, the sleigh moves so fast that it becomes a streak.

“We created the streak with motion blur and by generating paint trails from the 3D sleigh,” Stanley-Clamp says. “The cuts are really rapid.”



Special rigs managed the reins for Santa’s CG reindeer (at top), and because these reindeer really know how to fly, Cinesite also created virtual clouds and the CG landscape (below).

For the clouds, modelers built organic shapes in Houdini that the crew placed onto a grid. Using in-house tools, they derived a point cloud from the surface and interior of the cloud shapes. Then, they attached sprites to the points and rendered the clouds using subsurface scattering.
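CSclouds itself is proprietary, but the pipeline as described (derive points from a shape’s surface and interior, then hang a sprite on each point) can be sketched generically; the spherical stand-in shape and attribute ranges below are placeholders:

```python
# Generic sketch of the described cloud build: scatter a point cloud
# through a shape's interior and surface, then attach sprite attributes
# to every point. The sphere stand-in is a placeholder, not CSclouds.
import numpy as np

def scatter_cloud(center, radius, n_points, seed=0):
    rng = np.random.default_rng(seed)
    # Rejection-sample points inside a unit sphere; points near the
    # boundary cover the surface, the rest fill the interior.
    pts = []
    while len(pts) < n_points:
        p = rng.uniform(-1.0, 1.0, size=3)
        if p @ p <= 1.0:
            pts.append(p)
    pts = np.array(pts) * radius + center

    # Per-point sprite size and opacity jitter give the volume an uneven,
    # organic edge before a subsurface-scattering render pass.
    sizes = rng.uniform(0.5, 1.5, size=n_points) * radius * 0.2
    opacities = rng.uniform(0.2, 0.6, size=n_points)
    return pts, sizes, opacities

points, sizes, opacities = scatter_cloud(np.array([0.0, 500.0, 0.0]),
                                         radius=80.0, n_points=2000)
```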

To complete the illusion that Fred is in the sleigh, compositors layered in live-action elements for the errant brother and Santa’s elf Willy, the driver, when the camera is close. For those shots, Willy himself is a composite played by Jorge Rodero (body) and John Michael Higgins (head). “During the live-action shoot, Jorge did the performance while John Michael delivered the lines,” explains Stanley-Clamp. Later, Higgins delivered the same lines while being filmed on a bluescreen stage, and Cinesite switched the actors’ heads. For most flying sequences, however, the studio created a digital double for Fred and Willy from cyberscans and photo textures. “We did cyberscans of John Michael’s head to put onto Jorge’s body to create the CG double of Willy,” he says. “In many respects, that was easier than the 2D solution.”

In addition to Maya, Mudbox, and Houdini, the Cinesite crew used Adobe’s Photoshop, Apple’s Shake, Autodesk’s Inferno, and three proprietary tools: CSfur, CSclouds, and CANI, an animation interface tool. By applying these tools, the talented crew at Cinesite convinced every mother’s child that reindeer really know how to fly.

—Barbara Robertson

NATURAL EFFECTS, UNNATURAL PEOPLE
Sony Pictures Imageworks creates a post-apocalyptic Manhattan and fills it with plants, animals, and vampires.



In I Am Legend, when Neville (Will Smith) moves through Times Square, he doesn’t see streets filled with people, neon signs, or yellow taxis. He sees a lion stalking a herd of deer. Three years after a rampant virus caused the evacuation of Manhattan and its ensuing isolation, the island has returned to nature.

Of course, when director Francis Lawrence shot the Warner Bros. film on location, the city was as raucous as always. Sony Pictures Imageworks emptied the streets and did the dirty work that made the city look abandoned.

“We shot all over Manhattan,” says Jim Berney, visual effects supervisor at Imageworks. “So, we had to paint out all signs of life. The streets are cracked. Buildings are crumbling. There are weeds everywhere. The wildlife is back in town.”

For the buildings, Imageworks artists working in Adobe’s Photoshop took photographs shot on location and turned out the lights, tattered the awnings, aged and weathered the surfaces, added procedural bird poop, and more, and then projected those textures onto low-resolution 3D geometry representing the buildings.
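Projecting a dressed-down photograph onto simple geometry is a standard camera-mapping trick: push each vertex of the low-res building through the camera that took the photo to find which pixel it should wear. A bare-bones sketch, with the matrix convention and camera values as assumptions:

```python
# Bare-bones camera projection: map vertices of low-res building geometry
# to UVs in the painted photograph by projecting them through the camera
# that (approximately) shot the plate. Matrix and values are assumptions.
import numpy as np

def project_uvs(vertices, world_to_cam, focal, width, height):
    """vertices: (N, 3) world positions -> (N, 2) normalized UVs."""
    # Promote to homogeneous coordinates, then move into camera space.
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])
    cam = homo @ world_to_cam.T                     # (N, 4)

    # Perspective divide; this convention has the camera looking down -z.
    x = focal * cam[:, 0] / -cam[:, 2]
    y = focal * cam[:, 1] / -cam[:, 2]

    # Normalize pixel coordinates into [0, 1] UV space.
    u = x / width + 0.5
    v = y / height + 0.5
    return np.stack([u, v], axis=1)
```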

Modelers built the structures using survey data and LIDAR scans, the latter for a virtual background surrounding Times Square for scenes shot on a bluescreen stage. “We surveyed every location,” says Dave Smith, digital effects supervisor. “When we needed higher resolution, we used the LIDAR scans.”


For the final image (top), Sony Pictures Imageworks removed signs of life from plates shot in Manhattan and tracked the camera (bottom, right), and then grew grass with a hair system and weeds with a proprietary plant-growing program (bottom, left).

In addition to re-creating plates, the Imageworks crew also built all-digital environments for a seaport sequence during which they destroyed a 3D version of the Brooklyn Bridge. Imageworks uses Autodesk’s Maya for modeling and a combination of Side Effects’ Houdini and proprietary tools for dynamic simulations. Houdini dynamics set the stage for destruction by simulating the collapse of the large structure; proprietary tools provided fine details. “We drove the crinkling, bending, and snapping of smaller pieces from the low-resolution simulation of bigger pieces,” Smith says.
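One common way to layer in that kind of detail, not necessarily Imageworks’ proprietary method, is to bind each small fragment to its nearest coarse piece and inherit that piece’s motion, with secondary jitter added on top:

```python
# Sketch of detail driven by a coarse sim: each high-res fragment inherits
# the motion of its nearest low-res piece per frame, then gets its own
# crinkle on top. A stand-in for the proprietary layer, not it.
import numpy as np

def drive_fragments(frag_positions, coarse_centers, coarse_offsets, frame):
    """frag_positions: (N, 3); coarse_centers/offsets: (M, 3) per frame."""
    driven = np.empty_like(frag_positions)
    for i, p in enumerate(frag_positions):
        # Bind to the nearest coarse piece (a real pipeline would cache
        # this binding once at the rest frame rather than per frame).
        j = np.argmin(np.linalg.norm(coarse_centers - p, axis=1))
        # Inherit the coarse piece's motion...
        base = p + coarse_offsets[j]
        # ...plus small deterministic jitter for crinkling and snapping.
        jitter = 0.02 * np.sin(frame * 0.7 + i)
        driven[i] = base + jitter
    return driven
```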

Growing weeds through cracked sidewalks and streets proved to be more difficult, however. First, rotoscopers lifted people and buildings from the plates, and matchmovers provided a virtual camera. Then, texture painters worked on pre-made street and sidewalk tiles in various sizes, painting cracks and potholes that layout artists could fit into the plates.

Dave Stephens, visual effects animation supervisor, led the plant-growing team. “We had to populate large sections of New York with plant life,” he says. A hair system put grass in the painted cracks and potholes, and a plant-growing program developed within Houdini added a variety of weeds.

“The plants grew in each frame as they were rendered,” he says. “We had algorithms for each plant type, and every time the plant was instanced, it generated a unique variant within a predefined range. We could layer in premodeled flowers, buds, and leaves procedurally.” Particle trails placed behind characters or cars moving through a scene influenced a procedural engine that caused the plants to flutter as if they were reacting to those objects.
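The per-instance variation Stephens describes boils down to a seeded draw from a predefined range, keyed to the instance so each plant is unique yet regrows identically every frame. A sketch with invented plant types and parameter ranges:

```python
# Sketch of seeded per-instance plant variation: each instance draws its
# parameters from a predefined range, keyed by instance id so the same
# weed regrows identically in every rendered frame. Ranges are invented.
import numpy as np

PLANT_RANGES = {
    'dandelion': {'height': (0.10, 0.35), 'leaves': (4, 9),  'lean': (0, 15)},
    'crabgrass': {'height': (0.05, 0.20), 'leaves': (6, 14), 'lean': (0, 30)},
}

def instance_plant(plant_type, instance_id):
    # Deterministic seed per instance: same id, same plant, every frame.
    rng = np.random.default_rng(instance_id)
    r = PLANT_RANGES[plant_type]
    return {
        'type': plant_type,
        'height': rng.uniform(*r['height']),
        'leaves': int(rng.integers(r['leaves'][0], r['leaves'][1] + 1)),
        'lean_deg': rng.uniform(*r['lean']),
    }

# Every call with id 4021 yields the same unique variant of this weed.
weed = instance_plant('dandelion', instance_id=4021)
```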

In addition to planting the streets with weeds, Imageworks filled the air with flocks of birds and swarms of insects. “We have birds, gnats, and insects flying around in every frame,” says Berney. “They’re in the soundtrack, so we needed to add a visual cue.”

For a herd of deer, which Neville and the lions both hunt, the crew produced 15 variations of male and female deer from two models, changing their size, nose color, and antlers procedurally. “If we didn’t like one set of antlers, we could flip a switch to another set,” Smith says.

But the crew’s most intense creature work went into producing a variety of wildlife and creating the “infecteds,” people who survived the virus and became vampires. For these creatures, the modeling team worked from designs by Patrick Tatopoulos and cadaver reference to create athletic monsters with bodies wasted away by the virus. Texture painters gave them a sickly look.

“I think the creatures really push the state of the art,” says Berney, who watched the shot count grow from 400 to 800 during postproduction. “They’re a good blend of new technologies, but they’re still handcrafted.” 
   
Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.