By Barbara Robertson
All images © Disney/Pixar.
If you didn't suspect, when you were a child, that a monster might be in your closet, you probably knew someone who did. But did you ever ask why a monster would be there? When Pete Docter asked the question, his answers put him on a path that led to a feature film. Next month, on November 2, Walt Disney Pictures will present that film, Pixar Animation Studios' Monsters, Inc. The animated film marks the directorial debut for Docter, who was supervising animator and one of four screenwriters for the enormously successful Toy Story, Pixar's first full-length film.
With Toy Story, Pixar crafted a new feature film genre, and with each film since-A Bug's Life, Toy Story 2-the studio has pushed the art and technology of computer animation further. This film is no exception. The studio gave three of the country's leading experts in dynamic simulation, all of whom work for Pixar, two of the hardest problems in computer graphics: simulating hair and cloth. The Monsters, Inc. production team also devised innovative techniques for creating more visual complexity than in previous films, and mastered some tricky rendering problems. All these advances in technology and technique were needed to serve the story. The main characters required hair and cloth simulations; the other techniques helped make the monsters' world believable.
That world began to take shape as Docter and his colleagues at Pixar thought about why those monsters are in the closet.
Modelers provided animators with nearly 40 percent more controls for Mike (at left) and Sullivan (right) than for Toy Story 2's Al. To animate Sully's 2,320,413 blue hairs, Pixar developed Fizt, a dynamic simulator.
"We figured out that the monsters scare kids because that's their job. That's what they're paid to do," Docter says. Why? The monsters' world is powered by screams. "What the monsters do is go into a kid's room, harvest the scream, and then refine it into high- or low-octave scream for commercial- or industrial-grade use," explains Docter.
The two stars of the film, James P. Sullivan (voiced by John Goodman) and Mike Wazowski (Billy Crystal), work for one of the big three energy companies, Monsters Incorporated. "Sully" is the company's best scarer; Mike is his assistant, best friend, and roommate.
"The way it works is that there's a big vault that houses millions and millions of kids' closet doors that can be anywhere in the world-Paris, Rome, Detroit," says Docter "It's all computerized. You get the kid's card and swipe it through a console like you're paying for groceries. The computer accesses the door, and it's brought up on what looks like a dry-cleaner track. The door mechanism grabs it, the computer activates it, and the monster goes in and scares the kid."
As Mike and Sully walk to work at the scream factory, we get a monster's-eye view of the streets of Monstropolis, a town, created entirely with 3D computer graphics, that had its heyday in the late 1950s, when kids were still naive.
There's a big problem, though. "Times are tough for the energy companies," says Docter. "These days, kids are not as easily scared because of video games and the media, so there's an energy crisis."
One day a human child follows Sully into the monster world, where children are forbidden: The monsters believe kids are dangerous because they contain so much pure energy. And indeed, hiding the cute bundle of pure energy is difficult. "When she gets loud, light bulbs pop, and all kinds of things go haywire," says Darla Anderson, producer. But with Mike's help, Sully tries to send the little girl, whom he's named Boo, safely home before she's discovered. In so doing, he crosses paths with a jealous co-worker and stumbles upon a dastardly plot to boost energy production. And the plot thickens.
By mixing real animal parts, such as ram horns, bear fur, and octopus legs, and giving realistic textures unusual colors, Pixar designers created some 50 odd-looking monsters.
All told, the crew created 50 monsters plus variations and 22 different location sets, including kids' bedrooms, Harryhausen's sushi restaurant, the snowy home of the Yeti, the scream extraction factory, and the city of Monstropolis itself, where buildings are decorated with eyeballs and claws rather than flowers and vines. One of the most complex sets is the door vault inside the factory, which has 5.7 million unique closet doors, of which around 500,000 are on a mile-long "highway."
"It's probably the most complicated in terms of renderability," says Anderson, who notes that individual frames in some establishing shots, which have numerous moving doors and use atmospheric effects to emphasize the door vault's cavernous space, could take as long as 80 hours to render. That's a measure of the complexity of this film given today's processors-and Pixar has a ton of them: The studio's renderfarm now contains 3500 Sun Microsystems processors.
The shading team softened Boo's skin with "peach fuzz." When she gets mad, capillaries under her skin redden her cheeks.
To measure and compare the compute power needed for its films over the years, Pixar uses so-called "RenderMarks." (To calculate how many RenderMarks a particular machine has, Pixar runs a predefined set of images through RenderMan and measures how long it takes to render frames. A 1000 RenderMark CPU computes the same frame twice as fast as a 500 RenderMark CPU.) The first Toy Story (1995) used 50,000 RenderMarks for rendering; A Bug's Life (1998) needed 700,000 RenderMarks; and Toy Story 2 (1999) took 1.1 million. Monsters, Inc. required 2.5 million RenderMarks, more than the first three films combined.
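To make the arithmetic concrete, here is a minimal sketch of how those numbers relate. Only the per-film figures and the twice-as-fast relationship come from Pixar; the function and names below are our own illustration.

```python
# Render time scales inversely with a machine's RenderMark rating: a 1000
# RenderMark CPU renders the same frame in half the time of a 500 RenderMark
# CPU. Only the per-film totals below come from the article.

def relative_render_time(rendermarks, reference_rendermarks=500):
    """Frame render time relative to a 500 RenderMark CPU (illustrative)."""
    return reference_rendermarks / rendermarks

FILM_RENDERMARKS = {
    "Toy Story (1995)": 50_000,
    "A Bug's Life (1998)": 700_000,
    "Toy Story 2 (1999)": 1_100_000,
    "Monsters, Inc. (2001)": 2_500_000,
}

print(relative_render_time(1000))   # 0.5: half the time, i.e., twice as fast

first_three = sum(v for k, v in FILM_RENDERMARKS.items() if "Monsters" not in k)
print(first_three)                  # 1,850,000: less than Monsters, Inc. alone
```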
The big jump for Monsters, Inc. is due in part to Sully's hair and Boo's T-shirt. Some 2,320,413 hairs cover Sully's purple-spotted, eight-foot-tall body, all dynamically simulated using new technology developed for this film. Also simulated is Boo's oversize T-shirt, which hangs loosely on her 2-1/2-foot-tall body. The two stars appear in hundreds of shots.
The launching point for the new technology was a cloth simulation system created by senior scientist Michael Kass for Pixar's Oscar-winning short film Geri's Game (1997). Senior scientists Andy Witkin and David Baraff began working on the code base for the new simulator in 1998.
Sully and Mike transform from wireframe models (above left) into smooth shaded objects (above right), prepare for fur simulation (next image, left), and become finished monsters (next image, right). Sully's 28,000 "key hairs" (shown in the third image) provided information about hair characteristics to the dynamic simulator, Fizt. Fizt moved each of the millions of hairs based on the underlying animation, kept them from intersecting with each other and with objects such as the remote control, and even untangled them when necessary.
This simulator, named Fizt, is the result of a long-time collaboration among the three scientists: Kass and Witkin began publishing papers together on physically based modeling 15 years ago. Witkin and Baraff began working on cloth simulation at Carnegie Mellon University in 1992 and while there developed Maya Cloth for Alias|Wavefront. "There were things [in the Geri's Game and Maya Cloth systems] that were the best we could do at the time, but frankly, they weren't good enough for what we needed to do for Monsters," says Baraff.
"We needed a solution that was robust enough to work in a production environment for a film in which fur and clothing appear in many of the shots," says Witkin. "Things had to work with little human intervention, and the simulation couldn't grind our pipeline to a halt."
Thus, Witkin and Baraff began developing a system that could model a variety of dynamic effects, with emphasis on hair and cloth. "Hair and cloth have a common foundation," says Witkin, explaining, for example, that both hair and woven threads are easy to bend and resist stretching. In addition, both can be broken into small pieces-hair can be represented as a curve that can be broken into a chain of particles; a thin sheet of cloth can be broken into particles or triangles. "You can find common ground by saying that ultimately everything in the simulator is going to be approximated as a bunch of particles. Then you can model the dynamics of the particles and the forces that interact and couple them."
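To make the "bunch of particles" idea concrete, here is a minimal sketch of one hair treated as a chain of particles joined by springs that resist stretching. It is not Pixar's Fizt; every constant and name below is illustrative.

```python
import numpy as np

# A single hair approximated as a chain of particles connected by stretch
# springs, advanced with simple semi-implicit integration. Illustrative only.

N = 20                        # particles along one hair
REST = 0.05                   # rest length between neighboring particles
K_STRETCH = 500.0             # stretch stiffness
DAMPING = 0.98
GRAVITY = np.array([0.0, -9.8, 0.0])
DT = 1.0 / 240.0              # a substep of a 24 fps frame

def step(pos, vel, root):
    """Advance the chain one substep; the root particle follows the skin."""
    forces = np.tile(GRAVITY, (N, 1))
    for i in range(N - 1):
        d = pos[i + 1] - pos[i]
        length = np.linalg.norm(d)
        f = K_STRETCH * (length - REST) * d / max(length, 1e-9)
        forces[i] += f            # pull neighbors back toward the rest length
        forces[i + 1] -= f
    vel = DAMPING * (vel + DT * forces)
    pos = pos + DT * vel
    pos[0], vel[0] = root, 0.0    # pin the root to the animated character surface
    return pos, vel

pos = np.array([[0.0, -i * REST, 0.0] for i in range(N)])
vel = np.zeros_like(pos)
for _ in range(240):              # one second of a hanging hair
    pos, vel = step(pos, vel, np.array([0.0, 0.0, 0.0]))
```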
"What's different is that the interaction forces are more complicated with clothing because you have things happening in two directions and because the problems due to collision and contact are more complicated," Witkin says.
The two biggest issues, according to Baraff, were cloth-to-cloth and cloth-to-solid collisions. In cloth-to-cloth collisions, the cloth wrinkles and folds on itself. "Picture what happens when you crumple a piece of cloth into a ball," says Witkin. "[The simulated cloth] has to keep from passing through itself." Cloth-to-solid collisions happen when cloth is draped on the human body. In this case, the challenging problems happen when cloth is draped on a character's body and gets sandwiched between two body parts. This typically happens in armpits and elbows and when animators place a character's arm next to its side. "The cloth goes inside the body and even intersects itself yet somehow you expect that when you move the arm, the cloth will magically settle back into a nice clean state," says Baraff. The problem is similar for Sully's fur. When his arms are tightly at his side, for example, his arm hair bunches and intersects with his body hair.
Usually, Fizt moved Boo's T-shirt based on her animation. Here, Sully's movement directed the simulation and Boo was animated to match. When Boo's T-shirt was first simulated, it had more wrinkles. The director preferred a simpler, cartoony look.
"If you don't know how to stay out of that situation or get out of it once you get into it, you can wind up with the cloth getting tangled in a way that's physically unrealistic and you can't find your way out. It grinds the simulator to a halt. Someone has to come in, iron out the wrinkles [or comb the hair] and start over. Even if that happens only from time to time, it can be a serious problem for the flow of production," says Witkin. Worse, because animators move only the body parts visible to the camera, body parts hidden from view often end up in strange and physically impossible positions-an animator might let an arm poke through a leg, for example.
By changing shader variants, the set-dressing department turned blank paper into posters to decorate the office walls for Roz, the slug-like receptionist.
"It's an exceedingly difficult problem, but we thought we knew how to solve it coming in," says Baraff. Their idea was to have the animators avoid creating these problems. "The word we got back rather quickly was, 'No. The animators will continue to do what they have always done. You will find a way to make the simulator work correctly in the presence of that,'" he says.
They solved the problem by creating a physically based simulator that was tolerant of physically unrealistic behavior. The simulator knows how to keep tangles and interpenetrations from happening for the most part, but if they should happen, it can repair the cloth (or hair) and put it back into the expected shape. Thus, because interpenetrations can happen for a short time in a relatively small place, animators usually did not have to change the way they worked. Simulation and effects sequence supervisor Mark Henne, who helped turn the simulation technology into a viable tool for cloth dynamics, says, "We had to send the cloth simulation back to animators because of intersections in only about 10 percent of the cases."
Thus, although simulation added a step to the pipeline, the process didn't create a bottleneck. In addition, the scientists accelerated the simulation by using such techniques as multithreading. They cite 17 minutes per animation second (24 frames) of output for cloth simulation, and 10 minutes per second for Sully's hair. Pixar won't reveal details about the solution other than to say it involves "analyzing the geometry of the way the cloth is intersecting itself." The studio has filed for patents on the technology, and the scientists expect to publish technical papers describing the process next year.
Set dressers chose 3D models of props from an electronic catalog and used shader variants to alter them, changing the coffee cup's texture and the rug's design, for example, to create unique locations.
As did Henne for cloth simulation, simulation and effects sequence supervisors Michael Fong and Steven May brought hair simulation into the production process. They put Sully through obstacle courses to test the simulator, created tools, methods, and shaders for growing, grooming, and rendering the hair, and even created "wind widgets" for specific situations. For example, the underlying engine for growing hair is a RenderMan DSO (dynamically shared object) written by May that distributes hair on the character, reads data from the simulator, and then runs a shader called a "builder" for every hair. "The builder has information about each hair-its length, taper, color, and other characteristics that make it unique." And Fong wrote an interactive 3D grooming tool for the 28,000 "key hairs" that are procedurally distributed on all the vertices on Sully's model.
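The sketch below illustrates the general idea of a per-hair "builder": each of millions of render hairs borrows its shape from a few nearby simulated key hairs and then gets its own length, taper, and color variation. It is a hypothetical Python mock-up, not Pixar's RenderMan DSO or shader.

```python
import numpy as np

# Illustrative only: generate one render hair from nearby key hairs so that
# millions of hairs can follow the motion of only ~28,000 simulated guides.

def build_hair(root, key_hairs, rng):
    """key_hairs: list of (root_point, curve) where curve is an (M, 3) array."""
    # Weight the three nearest key hairs by inverse distance to this follicle.
    dists = np.array([np.linalg.norm(kr - root) for kr, _ in key_hairs])
    idx = np.argsort(dists)[:3]
    w = 1.0 / (dists[idx] + 1e-6)
    w /= w.sum()

    # Blend the guide curves, then re-root the result at this hair's follicle.
    curve = sum(wi * key_hairs[i][1] for wi, i in zip(w, idx))
    curve = curve - curve[0] + root

    return {
        "curve": curve,
        "length_scale": rng.uniform(0.8, 1.2),   # per-hair variation
        "taper": rng.uniform(0.3, 0.6),
        "color_jitter": rng.uniform(-0.05, 0.05),
    }

# Example: three key hairs and one render hair rooted between them.
rng = np.random.default_rng(0)
keys = [(np.array([x, 0.0, 0.0]),
         np.array([[x, 0.0, 0.0], [x, 0.5, 0.1], [x, 1.0, 0.3]]))
        for x in (0.0, 0.1, 0.2)]
hair = build_hair(np.array([0.05, 0.0, 0.0]), keys, rng)
```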
Because simulation added another process to the production pipeline, Pixar created a new "shots" department under the direction of Galyn Susman, simulation and effects supervisor. Susman explains, "We're the catch-all, can-do-it department. Once a shot leaves layout and goes to animation, we assign a shot TD [technical director] to it. The shot TDs are responsible for simulation, effects, and rendering passes. They also help lighting supervisors and optimize rendering." Shot TDs, for example, built collision objects used by the simulator and created atmospheric effects. They put animated scenes onto video monitors for shots within shots and created 3D matte paintings that provide a quick glimpse of what's behind a closet door. And they created other types of effects.
"Some things don't fall easily into typical categories, but a lot of what we do is make the characters feel like they are stitched into their environment," Susman says. "We put footprints in the snow, or make a soft cushion respond to a character's touch."
While most people in production use Pixar's proprietary software, Susman's group uses "anything and everything we can get," she says, listing, in addition to Pixar's internal tools, Nothing Real's Shake, Adobe Systems' After Effects, and Interactive Effects' Amazon Paint. "And we use Maya a ton. We do all our particle work in Maya." Alias|Wavefront's Maya is also used to create models at Pixar.
In addition, the shots department is responsible for final rendering. Tom Porter, supervising technical director, explains, "We used to have a rendering group at the tail end of the process and by the time the shots got there they could be huge. Now, the shot TD is responsible for the shot as it moves through the pipeline."
Although Sully, Mike, and Mike's snake-haired girlfriend were hand-crafted, background monsters, such as the yellow guy in back, were assembled from parts. The shading team developed modular pieces of shading to blend the parts together.
One rendering problem in particular was posed by Sully's hair. "We got tremendous help from the RenderMan team," says Rick Sayre, shading supervisor. "They completely rewrote the way hair is rendered and made it a lot faster and more efficient." For example, a team led by RenderMan engineer Craig Kolb devised new algorithms to reduce the number of micropolygons generated for each hair and to use parallel processing for the hair shading. Another "hair" challenge involved lighting and shading. "The way hair responds to light is intimately tied with shadows. If hair doesn't have self-shadows on it, it doesn't look right," Sayre explains. "And the way we work is to think of shadows as a conscious choice. They don't just happen." But RenderMan's shadow maps, which are basically on or off-something is in shadow or it's not-didn't produce the effect they wanted. The shading team also discovered that classical hair illumination models, even if mathematically sound, don't look right. Instead, they used an algorithm called Deep Shadows developed by Tom Lokovic, based on research published in the paper "Deep Shadow Maps" by Lokovic and Eric Veach, Pixar RenderMan engineers, in the SIGGRAPH 2000 Proceedings. With Deep Shadows, they were able to self-shadow the hair using colored shadows with transparency, and to have motion-blurred objects cast shadows. "One of the best uses for Deep Shadows is for hair," says Dan McCoy, shading technical director.
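The heart of a deep shadow map, as described in the Lokovic and Veach paper, is that each shadow-map pixel stores a visibility function of depth rather than a single depth. The following sketch shows one plausible way to evaluate such a function; it is our illustration, not Pixar's code.

```python
import bisect

# Each shadow-map pixel stores a piecewise-linear visibility function of
# depth, so semi-transparent hair and fog can cast partial shadows. A colored
# shadow would store one such function per color channel. Illustrative only.

def transmittance(vis_function, depth):
    """vis_function: sorted list of (depth, visibility) control points."""
    depths = [d for d, _ in vis_function]
    i = bisect.bisect_right(depths, depth)
    if i == 0:
        return vis_function[0][1]          # in front of everything: fully lit
    if i == len(vis_function):
        return vis_function[-1][1]         # behind everything
    (d0, v0), (d1, v1) = vis_function[i - 1], vis_function[i]
    t = (depth - d0) / (d1 - d0)
    return v0 + t * (v1 - v0)              # interpolate between stored samples

# One pixel's visibility function: light fades gradually through a hair clump.
pixel = [(0.0, 1.0), (1.0, 1.0), (1.5, 0.4), (3.0, 0.1)]
print(transmittance(pixel, 1.25))          # partially shadowed: 0.7
```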
The shadowing technique was also used for the fog, steam, and atmospheric effects that occur in Monstropolis, because screams look a bit like steam. To help shot TDs and lighting supervisors work easily with these effects, the shading team created little programs, DSOs, which could be linked to the renderer. "You could have a foggy sphere, a foggy box, a volume of fog in the room, or little puffy fog stickers," says Sayre. "For example, light beams shining through particulate matter in fog helped give the door vault greater depth. You can treat fog as a first-class object, put lights on fog or not, and use different intensities in the shadows for fogs."
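As a rough illustration of treating fog as a first-class object, the sketch below defines a foggy sphere and attenuates a light beam by the density it passes through. It is not one of Pixar's DSOs; the falloff model and all numbers are assumptions.

```python
import math

# A fog primitive (a sphere here; a box would work the same way) and a simple
# ray march that attenuates light by Beer-Lambert falloff. Illustrative only.

def fog_density_sphere(p, center, radius, density):
    """Constant-density fog inside a sphere, zero outside."""
    dx = [a - b for a, b in zip(p, center)]
    inside = math.sqrt(sum(d * d for d in dx)) < radius
    return density if inside else 0.0

def transmittance_along_ray(origin, direction, length, density_fn, steps=64):
    """March along the ray, accumulating optical depth through the fog."""
    dt = length / steps
    optical_depth = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        p = [o + t * d for o, d in zip(origin, direction)]
        optical_depth += density_fn(p) * dt
    return math.exp(-optical_depth)        # fraction of light that gets through

# A light beam crossing a foggy sphere (made-up numbers).
fog = lambda p: fog_density_sphere(p, center=(0, 0, 5), radius=2.0, density=0.4)
print(transmittance_along_ray((0, 0, 0), (0, 0, 1), 10.0, fog))
```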
Another consequence of being powered by steam is that there are steam-carrying pipes everywhere. To create this network, the modeling department made a construction kit of pipes and connectors that the set-dressing department could put together. Tony Apodaca, who worked on production shading for Monsters, Inc., created an intelligent shader that worked with the construction kit. "The shader would know where it was in space, so it made the pipe a little corroded at junctions, and scratched the pipe based on how big it was," says Sayre.
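The following sketch suggests what such an "intelligent" shader might compute: corrosion that falls off with distance from a junction and scratching that grows with pipe size. It is our guess at the flavor of the technique, not Apodaca's shader; every function, parameter, and constant is hypothetical.

```python
import math

# For one shading point on a pipe, return how corroded and scratched the
# surface should look: more corrosion near junctions, more scratching on
# bigger pipes. Illustrative only.

def pipe_wear(dist_to_junction, pipe_radius, noise=0.0):
    """Return (corrosion, scratches) in [0, 1] for one shading point."""
    # Corrosion falls off smoothly with distance from the nearest junction.
    corrosion = math.exp(-dist_to_junction / 0.2) * (0.7 + 0.3 * noise)
    # Bigger pipes accumulate more scratches (clamped to [0, 1]).
    scratches = min(1.0, pipe_radius / 0.5) * (0.5 + 0.5 * noise)
    return min(corrosion, 1.0), scratches

# A point 5 cm from a junction on a fat pipe looks corroded and scratched.
print(pipe_wear(dist_to_junction=0.05, pipe_radius=0.4, noise=0.3))
```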
This is the first time the shading team worked as closely with set dressing as with lighting, where final touches are added to a model's surface. For this film, the shading department helped make lighting easier and also gave lights a new purpose. "The way we light is very close to painting," says Jean-Claude Kalache, supervising lighting lead. "We look at a corner and think we need a highlight or small shadow and then step back and look at our work." Thus, an average shot in Monsters, Inc. had 200 lights.
To ease the hard problem of lighting Sully's hair, the shading department gave the lighting team tools that let them accurately light the furry monster by lighting only his skin, which made the process much faster than if his hair had to be visible. And, with clever "coco" (color correction) lights developed in the shading department, the lighting team could "paint" monster models with colored lights to create the visually complex illusion of hundreds of unique monsters in the background.
The artists in the set-dressing department gave the film much of its most obvious visual complexity, however. "Essentially, when a set comes to us from the modeling department, it's just walls and windows," says Sophie Vincelette, supervising set dresser. To help set dressers turn bare rooms into complex stage sets, the modeling department provided an electronic catalog with hundreds of 3D props: everything they could imagine would be needed in a real world, such as paper, pencil, computer, desk, and chair for factory offices; a manhole cover, litter, lights, and other accoutrements for streets; and a poster, book, magazine, and toys for kids' rooms. The shading department then created textures and shaders and gave set dressers controls they could use to vary the look of models. "They could order Burp cola, number seven, slightly scratched, from the catalog," says Sayre.
The set dressers often took their cues from the story. Take Mike's desk, for example. "They're always saying to Mike in the movie, 'Don't forget to file your paperwork,' so we wanted to make it look like he doesn't file his paperwork," says Vincelette. They piled folders of papers on his desk, and on the floor around the desk if it were late in the day, using the shader variants to bend and fold the paper, and to change colors and texture maps.
The set dressers also helped give the city its late '50s look. "We try to put history in the worlds we create," says John Lasseter, executive producer. "Not only do we do a back story of each character, we do a back story of the buildings." The back story in this case is that Monstropolis's big growth happened in the late '50s, when the baby boomers were still young, naive, and easily scared.
This attention to detail and to creating visual complexity permeates the film. "As a director, you work really hard to make the audience suspend their disbelief," says Lasseter, who directed Pixar's previous features. "You take them into a world where they are just with you. If any visual thing happens that doesn't seem right, boom, you've lost them for a few seconds."
Adds Ed Catmull, Pixar founder and president: "From a technical point of view, I think we're getting closer to the point in our films where you don't notice the technology. If people pay attention to the effects when they watch our films, then we haven't done our job. But, for example, if the hair looks so good that you don't think about it because it feels like a natural part of the character, then we will have succeeded. If movement is too simple, then it stands out. We've become more complex in order not to be noticed."
What will be next, then, for the studio in terms of technology?
"It used to be barely possible to make these films," says Catmull. "Now the difficult challenge is to make it easy."
Barbara Robertson is Senior Editor, West Coast, for Computer Graphics World.