By Barbara Robertson
How do you make time fly? That's the question Digital Domain had to answer visually for The Time Machine, a film based on the novel by H.G. Wells, which begins in the 1890s and proceeds to 800,000 years in the future. Directed by Gore Verbinski and Simon Wells, great-grandson of H.G. Wells, the DreamWorks film is scheduled to open March 8. Guy Pearce stars as Alexander Hartdegen, the scientist/inventor/traveler who zips into the future and discovers two races: Morlocks and Eloi, the hunters and the hunted.
The movie's mixed reviews may limit its future at the box office; however, during the process of creating effects, Digital Domain and Industrial Light & Magic solved some color range problems that will impact the future of filmmaking.
Digital Domain had the lion's share of the work, handling 250 shots for the three time travel sequences. In addition, ILM created CG Morlocks, Illusion Arts provided matte paintings, and Cinesite took care of wire removal, changed makeup, and helped destroy the world.
The first of the three time travel sequences takes Alex from the turn of the twentieth century to the year 2030 in less than three minutes. Alex has built his time machine in a greenhouse behind his Manhattan brownstone. "To show time passing, you see plants inside the greenhouse die, but outside a vine starts to grow. Winter comes, snow forms on the vine, spring flowers come again," says David Prescott, CG supervisor. When the camera pulls out of the greenhouse, a long "powers of 10" shot takes the viewer from Manhattan to the moon. "The difference is that during the powers of 10, we had to show time travel," says Prescott. Alex's house is torn down and gets rebuilt, New York grows skyscrapers, traffic increases, seasons come and go in Central Park, weather patterns form, and satellites orbit the earth. "Typically these things are done as nested 2D images, but that was ruled out because we start so close and because the camera made it a three-dimensional solution," says Erik Nash, visual effects supervisor. Because the camera traveled inside the city, rather than simply switching from one painting to another, the team had to create portions of the city in 3D. The 800-frame, 33-second shot is entirely CG from start to finish.
During the second time travel sequence, time flies even faster as Alex zooms from 2030 to 802,701 in only a few seconds. In this sequence, the city becomes a desert and then a grand canyon that is overtaken by an ice age. Each frame had to look photoreal even though every element in the frame was changing. To create the elements, Digital Domain developed innovative terrain generation software; to weave the animated elements together into coherent frames, the compositors used the latest version of Nuke, Digital Domain's proprietary compositing software.
For the terrain, starting with USGS data of canyons, the team generated "before" and "after" height fields with the help of Terragen software. The height fields were used to drive density functions. "The idea is that the terrain will erode in the weakest, least dense places," says Johnny Gibson, technical director. "The height fields tell us where deep areas are, and these are the less dense areas." The "before" and "after" height fields were used as keyframes for the animation; erosion algorithms changed the landscape between the keyframes, creating the animated elements. The height field and density functions also provided a basis for displacement shaders. In addition, the team created plant life, rubble from a collapsing city, sand dunes crossing the terrain, and snow elements, all in Side Effects Software's Houdini animation and Mantra rendering software.
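The keyframed-erosion idea can be sketched in a few lines of Python and NumPy, assuming the "before" and "after" terrains are simple 2D height-field arrays. The density proxy, the blending rule, and the smoothing pass below are illustrative stand-ins rather than Digital Domain's actual erosion algorithms.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_from_height(height):
    """Illustrative density proxy: low (deep) areas are treated as the
    least dense, so they are the first to erode."""
    span = height.max() - height.min()
    return (height - height.min()) / (span + 1e-8)

def erode_between_keyframes(before, after, t):
    """Blend two height-field keyframes, letting low-density regions
    reach the "after" state sooner.  t runs from 0 to 1 over the shot."""
    density = density_from_height(before)
    # Weak (low-density) areas erode ahead of the global schedule.
    local_t = np.clip(t * (2.0 - density), 0.0, 1.0)
    frame = (1.0 - local_t) * before + local_t * after
    # A light smoothing pass stands in for a real per-frame erosion step.
    return gaussian_filter(frame, sigma=1.0)

# Usage: stand-in height fields and a frame midway through a shot.
before = np.random.rand(512, 512)                  # would come from USGS data
after = gaussian_filter(before, sigma=8.0) - 0.3   # hypothetical eroded terrain
midway = erode_between_keyframes(before, after, 0.5)
```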
As hard as it was to create the moving, eroding terrain and the other elements, stitching them together into photorealistic frames was equally tricky: the sequence had to look as if hundreds of thousands of years were flying past in a few seconds without dissolving into a blur. "We stopped trying to figure out how much time would pass per frame and just made it look good," says Bryan Grill, compositor. "And, we added qualities that would ground the shots." In one shot, for example, the compositors added rain, lightning strikes, and stormy clouds.
The third time travel sequence begins when Alex gets into a fight with the Über-Morlock, drags him into the time machine, and kicks the machine into gear. During this sequence, Alex forces the Morlock outside the time bubble that surrounds the time machine when it's active. Once outside, he is not protected from aging and withers away to dust and bones. "All 80 shots in the time bubble are light effects," says Prescott. "Standard 8-bit linear images wouldn't have given us the range of colors we needed, so we wrote our own image format based on high dynamic ranges." They've named the format rgbe.
The idea is based on the paper "Recovering High Dynamic Range Radiance Maps from Photographs," by Paul E. Debevec and Jitendra Malik (SIGGRAPH 1997 proceedings). "Instead of having color in red, green, and blue channels only, this format has floating point information in r, g, and b as well as an exponent channel," Prescott explains.
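The general idea can be sketched with the classic shared-exponent encoding used by Greg Ward's Radiance .hdr format: three 8-bit mantissas plus one 8-bit exponent per pixel. Digital Domain's rgbe reportedly keeps floating-point r, g, and b channels, so the Python sketch below illustrates the exponent-channel principle rather than the studio's exact file layout.

```python
import numpy as np

def rgbe_encode(rgb):
    """Pack floating-point RGB into four bytes: three 8-bit mantissas
    scaled by a shared power-of-two exponent (stored offset by 128)."""
    rgb = np.asarray(rgb, dtype=np.float32)
    max_c = rgb.max(axis=-1, keepdims=True)
    exp = np.ceil(np.log2(np.maximum(max_c, 1e-32)))
    scale = np.where(max_c > 1e-32, 2.0 ** (-exp), 0.0)
    mantissa = np.clip(rgb * scale * 255.0, 0, 255).astype(np.uint8)
    e = np.clip(exp + 128.0, 0, 255).astype(np.uint8)
    return np.concatenate([mantissa, e], axis=-1)

def rgbe_decode(rgbe):
    """Recover floating-point RGB from the packed four-byte pixels."""
    mantissa = rgbe[..., :3].astype(np.float32) / 255.0
    exp = rgbe[..., 3:4].astype(np.float32) - 128.0
    return mantissa * (2.0 ** exp)

# A pixel 12x brighter than display white survives the round trip,
# which an 8-bit integer image could not represent at all.
pixel = np.array([[12.0, 6.0, 3.0]], dtype=np.float32)
print(rgbe_decode(rgbe_encode(pixel)))   # roughly [[12.0, 6.0, 3.0]]
```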
[Caption] When the time machine is active, a "time bubble" surrounds it. Everything inside the bubble, such as Alex Hartdegen (played by actor Guy Pearce), is transported in time. The bubble is made of light effects so bright that they were difficult to create within the limits of standard 8-bit images.
Coincidentally, ILM has also developed an extended range format, EXR, and The Time Machine is one of the first films to make use of it. "The new floating point scanning format allows us to capture a larger dynamic range," says ILM's Scott Squires, visual effects supervisor. "Because it's floating point, we can get more than 16 bits of color per channel."
Although scanners can capture more than 8 bits per color from film, most graphics software can accommodate only 8-bit color and monitors can display only 8-bit color, so the scanned data is typically limited to an 8-bit color range. Sometimes, that's not enough. "You can get an image on film that's 12 times brighter than what can be displayed on a monitor," explains Jonathan Egstad, digital effects supervisor at Digital Domain. "So when you convert those images to 8-bit rgb, you have to decide what area of exposure you care about and chop away everything else." That decision can come back to haunt the production down the line.
"When we shoot something out to film, we give the client a negative and they can print it any way they want," explains Grill. "If there are multiple light effects in a scene and they want to print it darker, with 8 bits you can only darken the negative a certain amount before the bright white goes gray. There isn't enough information for it to stay bright as you darken the film." Equally important, at the other end of the color range detail can be lost in the shadows.
For The Time Machine, ILM created digital Morlocks and in one scene, the creatures race on all fours through a dark cavern. "It would have been impossible for the actors wearing the Morlock costumes to run that fast on all fours," says Carl Frederick, CG supervisor at ILM. And without the new EXR format, details in the dark, underground world would have disappeared.
Increasing the dynamic range solves the problem of getting details in shadows and retaining bright lights, but it creates others. "A lot of programs needed to be modified to handle the greater bit depth," says Frederick. In addition, the studios had to devise ways for the effects team to look at the entire spectrum of color on 8-bit monitors. "We do that by shifting which part of the spectrum we're looking at with the software," says Frederick. Similarly, Digital Domain added f-stop tools to Nuke.
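A viewer exposure control of the kind both studios describe can be sketched as follows; the function name and the gamma-2.2 quantization are assumptions for illustration, not ILM's tools or the actual f-stop controls in Nuke. Sliding the exposure up or down in whole stops lets an artist inspect shadow or highlight detail that any single fixed 8-bit conversion would clip.

```python
import numpy as np

def view_at_stops(linear_rgb, stops=0.0, gamma=2.2):
    """Expose a scene-linear float image up or down by whole camera stops
    (factors of two), then clip and quantize for an 8-bit monitor."""
    exposed = np.clip(np.asarray(linear_rgb) * (2.0 ** stops), 0.0, 1.0)
    return np.round(255.0 * exposed ** (1.0 / gamma)).astype(np.uint8)

# One floating-point frame, inspected three ways on an 8-bit display.
frame = np.array([0.002, 0.5, 12.0])   # deep shadow, midtone, hot highlight
print(view_at_stops(frame, 0.0))       # [ 15 186 255]  highlight clips to white
print(view_at_stops(frame, -4.0))      # [  4  53 224]  stop down: highlight detail appears
print(view_at_stops(frame, +4.0))      # [ 53 255 255]  stop up: shadow detail appears
```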
Grill was concerned at first that the new color space would make his work more difficult and not just because he had new tools to learn. "I was afraid I would have to keep an eye on every little thing now," he says. "Before, when we scanned production footage, I knew we'd lose 2 bits right off the top, so I could think, 'it'll chop off, so I don't have to worry about it.' But, when we brought up old images from other shows and compared them, it was like night and day. I wish we could have done this before."
Ultimately, both studios hope that others will implement some type of extended range format for storing color information to make information exchange easier and that software companies will accommodate the new formats. "There are lots of adjustments software companies could do that would be really useful," says Gibson.
"The time machine effects would have suffered tremendously in 8-bit renders," adds Digital Domain's Nash. "It [rgbe] was absolutely invaluable."
Barbara Robertson is Senior Editor, West Coast, for Computer Graphics World.