Consider the premise: a film that focuses most of its screen time on one shipwrecked teenager contemplating god while trapped on the open ocean in a 27-foot lifeboat with a Bengal tiger. A remarkable vision. But, how could a director make such a film?
The director is Ang Lee; the film, Life of Pi. And the answer is, with one very large wave tank, an ocean of sophisticated digital visual effects, and a remarkable digital tiger. Bill Westenhofer of Rhythm & Hues (R&H), who received an Oscar for The Golden Compass and an Oscar nomination for The Chronicles of Narnia: The Lion, the Witch and the Wardrobe, supervised the visual effects. Three actors played the role of Pi: Ayush Tandon was young Pi, Irrfan Khan was older Pi, and Suraj Sharma was Pi during most of the film. The 20th Century Fox release opened the New York Film Festival and won the audience award at the Mill Valley Film Festival. Tom Shone, writing in The Guardian, calls Life of Pi “gently astonishing.”
Based on the best-selling fantasy-adventure novel by Yann Martel, which won the Man Booker Prize for fiction, the film begins with Pi as an adult telling his story to a writer. When Pi was young, his father owned a zoo in Pondicherry, India, where the boy studied Hinduism, Christianity, and Islam, and gained some understanding of animal psychology. Both fascinations would soon become startlingly important. When Pi became a teenager, his parents decided to move the family to Canada and ship some zoo animals with them. After a few days at sea, though, the ship sinks in a huge storm.
“Even as the waves heave and roll (to especially fearsome effect in stereo 3D), the film finds room for isolated moments of haunting poetry, such as the sight of the ship’s ghostly white lights descending into the abyss,” writes Justin Chang in Variety.
A poetic abyss extended with stereo 3D and created with computer graphics.
“It’s a very artistic movie,” says Westenhofer. “The first day back after principal photography, Ang [Lee] said, ‘I’m looking forward to making art with you.’”
The resulting visual effects included digital ocean extensions, digital skies, the ship sinking in the storm, a second storm, four major CG animals, sea life, meerkats, matte paintings, and set extensions—all told, 690 visual effects shots in the film. But, that number doesn’t tell the whole story.
After the storm, Pi finds himself in a lifeboat with a Bengal tiger named Richard Parker, a spotted hyena, a zebra with a broken leg, and an orangutan. Soon, only Pi and Richard Parker are left, and despite the human name, the tiger is a beast.
“Three-fifths of the movie is on the ocean with Pi smack dab on the surface of the water, and we knew immediately that we couldn’t shoot a 3D movie on the open ocean,” Westenhofer says. “Ang Lee embraced stereo 3D, and in order to feel the stereo and get the look he wanted, he preferred long shots that allowed you to breathe. There are more shots with 2000 frames in this film than any other in my career; only 960 shots in the entire movie. We have an hour and 29 minutes of visual effects.”
The orangutan is real when in the cargo hold of the family’s ship, but on the lifeboat at top, she’s digital, created at Rhythm & Hues, as are all the meerkats at bottom.
Artists at Rhythm & Hues in Los Angeles, Kuala Lumpur (Malaysia), Vancouver, and Mumbai and Hyderabad (India) created most of the visual effects, including the digital water, the tiger Richard Parker, the sea life, and all the other animals except those in a dream sequence realized at BUF and those on the sinking ship. The Moving Picture Company artists took charge of the sinking ship and the two storm sequences. The actor Sharma becomes thinner thanks to artists at Lola. A Crazy Horse crew created matte paintings to augment and alter sequences filmed in India. A team at Look Effects handled approximately 30 shots that included extending the cargo hold on the ship and enhancing scenes shot in India by placing boats with candles in a lake, painting out mountains from backgrounds, and removing modern details during a rainy street scene to create the proper period. And, artists at yU+co manipulated 3D transitions and produced the titles.
“This is not the typical stereo 3D movie with superheroes and explosions,” Westenhofer says. “It’s a drama. If it works, it will answer the question, ‘Why 3D?’ ” He provides an example: “There’s a pretty shot of the sun breaking through a cloud and the water shimmering with silver light. Pi stands up and calls for help. He’s in the middle of this beautiful scene and you realize he’ll never get rescued. You feel his desolation more because of the [stereo] 3D.”
In the Tank
A crew at Halon created the previs, working in stereo 3D with Ang Lee early in 2010. “Anything to do with the water was previs’d,” Westenhofer says. “Ang wanted the previs to be what he would shoot, and it pretty much is, but the shoot was so physically demanding that he felt he barely got what he needed.”
To film Sharma in the lifeboat, and sometimes on a makeshift raft near the boat, the crew used an enormous 30x70-meter wave tank nearly the size of a football field built in Taiwan specifically for this film. Even so, filming was a challenge.
Lee wanted carefully crafted camera moves, often long camera moves, but the stereo camera rig was bulky, and it was hard to move it around and over the tank. Moreover, the set could drift away. “The camera operator had a lot of difficult days trying to nail the cameras coming in on a boat sloshing around,” Westenhofer says. “But, we didn’t want to tether the boat too much. Shooting on water, even in a tank, is very difficult.” On days with heavy winds, the crew filmed the actor in a boat on a gimbal, but they shot as much as they could in the tank.
It would be up to the visual effects crew at R&H, then, to flow the water in the tank out into infinity, blend the shots with the boat on a gimbal into those filmed in the tank, and give Lee the look he wanted.
The tiger Richard Parker is digital in 86 percent of the shots, including this one in which a digital boat floats on real water filmed in a wave tank.
“Other tanks have non-directional waves and a bathtub effect in which water rebounds off walls,” Westenhofer explains. “We had wave-making equipment designed for theme parks that could produce four- or five-foot waves for 14 seconds, but Ang wanted more.” That is, the director often wanted bigger waves that lasted longer.
“It was a significant technical challenge to place the water in the tank on top of a digital ocean that was moving,” Westenhofer says. “And, we had to do that in stereo. It was one of those things that we acted confident about without knowing how we could do it. The plates we got were of Pi on the water in a wave tank, in front of bluescreen, so it was the job of visual effects to come up with the look. The water was the hardest, most technically challenging effect. But, we pulled it off.”
The Ocean
To extend the water in shots filmed with Sharma on a boat or raft in the tank, R&H engineers created a procedural wave system within Side Effects’ Houdini. “On rehearsal days, we shot a ton of reference,” Westenhofer says. “We painted a grid system on the wall so we could measure the wave heights over time. Then, we calibrated our wave tools to that, dialing the frequency up and down into the octaves of our [digital] waves.”
With this system, the simulation artists formed and saved 60 different wave types that they replicated and used to create the ocean surface for various shots. Then, they added fine wave ripples that matched patterns created by wind machines on set, and simulated white-water spray and foam as needed. Some of the hardest water shots to create occurred after the ship sank and the storm subsided somewhat. Finding the right wind detail and heavy chop for those shots took three months.
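R&H’s wave tools are proprietary, but the general idea of summing calibrated octaves of procedural waves can be sketched in a few lines. In the toy example below, the octave table, the sampling pattern, and the single measured peak height are all assumptions made for illustration; the only connection to the production workflow is the notion of scaling procedural waves to match heights measured against the grid on the tank wall.

```python
import math

# Toy "sum of octaves" ocean height field, loosely in the spirit of calibrating
# procedural waves to heights measured on the tank wall. Everything numeric here
# is invented.

MEASURED_PEAK_HEIGHT = 1.5   # meters, e.g., read off the painted grid (assumed)

# One entry per octave: (wavelength in meters, relative amplitude, direction in radians)
OCTAVES = [
    (60.0, 1.00, 0.0),
    (30.0, 0.55, 0.4),
    (15.0, 0.30, -0.3),
    (7.5,  0.15, 0.9),
]

def raw_height(x, y, t):
    """Uncalibrated height: a sum of directional sine waves, one per octave."""
    h = 0.0
    for wavelength, amp, direction in OCTAVES:
        k = 2.0 * math.pi / wavelength                 # wavenumber
        omega = math.sqrt(9.81 * k)                    # deep-water dispersion
        phase = k * (x * math.cos(direction) + y * math.sin(direction)) - omega * t
        h += amp * math.sin(phase)
    return h

def calibration_scale(samples=2000):
    """Scale so the procedural peak roughly matches the measured tank peak."""
    peak = max(abs(raw_height(i * 1.7, i * 0.9, i * 0.13)) for i in range(samples))
    return MEASURED_PEAK_HEIGHT / peak

SCALE = calibration_scale()

def ocean_height(x, y, t):
    """Calibrated ocean height (meters) at position (x, y) and time t."""
    return SCALE * raw_height(x, y, t)

if __name__ == "__main__":
    print(ocean_height(10.0, 5.0, 2.0))
```

From a base like this, a library of saved wave types amounts to a set of stored parameter tables that artists can recall and dial per shot.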
Artists at Rhythm & Hues referenced footage from four tigers on set to create its realistic CG tiger. New raytracing tools in the studio’s Ren software bounced light from the environment into the tiger’s fur.
When Lee wanted bigger waves than those filmed in the tank—a higher or longer surge—the crew would track the lifeboat in the plate, measure the period of the waves to know the timing, create a larger surface procedurally, and hand the problem to Meg Morris, CG camera supervisor. “She would have the camera as a child of the boat and try to make the boat ride on a bigger surface,” Westenhofer says. “It took almost an animation sensibility.”
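The “camera as a child of the boat” idea amounts to preserving the tracked camera’s offset from the boat while re-floating the boat on a taller procedural swell. A minimal two-dimensional sketch, with an invented wave and invented offsets (none of this reflects R&H’s actual rigging), might look like this:

```python
import math

# Illustrative sketch: keep the camera's local offset from the boat while the
# boat rides a bigger procedural wave. The wave and offsets are invented.

def wave_height(x, t, amplitude=4.0, wavelength=80.0):
    """Height of a single big swell (meters) at position x and time t."""
    k = 2.0 * math.pi / wavelength
    omega = math.sqrt(9.81 * k)          # deep-water dispersion
    return amplitude * math.sin(k * x - omega * t)

def wave_slope(x, t, eps=0.01):
    """Surface slope, used to pitch the boat so it rides the swell."""
    return (wave_height(x + eps, t) - wave_height(x - eps, t)) / (2.0 * eps)

def boat_frame(x, t):
    """Boat position (x, z) and pitch angle on the bigger surface."""
    return (x, wave_height(x, t)), math.atan(wave_slope(x, t))

def camera_world(x, t, local_offset=(-6.0, 2.5)):
    """World-space camera position when the camera is a child of the boat."""
    (bx, bz), pitch = boat_frame(x, t)
    ox, oz = local_offset
    # Rotate the local offset by the boat's pitch, then translate by the boat.
    cx = bx + ox * math.cos(pitch) - oz * math.sin(pitch)
    cz = bz + ox * math.sin(pitch) + oz * math.cos(pitch)
    return cx, cz

if __name__ == "__main__":
    for frame in range(5):
        print(camera_world(x=12.0, t=frame / 24.0))
```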
As might be imagined, the crew often modified the water on a shot-by-shot basis. “Ang [Lee] wanted the ocean to be as much a character as anything else,” Westenhofer says. “And, he wanted everything to be authentic, so we did a ton of research.”
Westenhofer even shot reference from a Coast Guard cutter in heavy seas off Taiwan to capture the ocean in different lighting conditions and with the wind rippling the surface in various ways. “A slight change in light angle or wind patterns creates myriad looks on the surface, and we strove to achieve that,” he says.
The Storm
A crew of approximately 200 people in The Moving Picture Company’s (MPC’s) Vancouver office became involved in creating the film’s two storms. Supervisor Guillaume Rocheron worked on the project for nearly two and a half years, with two of those years in postproduction.
The biggest sequence for the studio in terms of screen time and complexity happens early in the film when a huge storm sinks the family’s ship. “We had CG water, the animals that escape, CG doubles, and the ship,” Rocheron says.
On set in Taiwan, two set pieces represented the ship’s upper and lower decks, each close to 60 feet, mounted on a gimbal, with bluescreen behind. MPC artists extended the practical sets and surrounded them with CG water. As soon as Pi goes into the water, though, the ship is all-digital, based on blueprints from the art department, and built in high enough resolution and with enough geometric detail to hold up in all-CG shots with CG characters and crushing waves. For textures, the crew photographed ships in Vancouver harbor.
“Our main challenge was dealing with the ocean,” Rocheron says. “We’ve created large-scale fluid simulations before, and it’s something more and more people know how to do, but the challenge for us was to work on controlling the ocean so that Ang [Lee] had the ability to choreograph the shots. He is not the sort of director who leaves anything to randomness. Everything is precisely art-directed and choreographed. He’d want a wavelet at frame 22 to be precisely a quarter-frame high. And it was clear that he would not have time to wait for simulation iterations.”
New facial controls helped the crew of 50 Rhythm & Hues animators working in Los Angeles, India, and Kuala Lumpur create Richard Parker’s performance.
Thus, rather than tweaking simulation parameters to create the storm, the crew decided to block in composition and timing through a layout stage first. For the ocean surface at this stage, they used simple geometry: a grid. A Tessendorf deformer shaped the grid surface into waves based on parameters that described a Beaufort Scale wind force and the corresponding probable wave height.
“The Beaufort Scale references storms from one to 12,” Rocheron says. “We went for storm force 12 for both our storms. That produced waves 250 meters long and 15 to 20 meters high with a period of 10 to 12 seconds between waves. Our team created [Autodesk] Maya deformers programmed with these properties to shape the waves. And, on top of that, we could keyframe waves manually to move particular waves and make them faster or slower.”
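MPC built its deformers inside Maya, and the specifics are the studio’s own, but the shape of the approach can be sketched: a small table maps a Beaufort force to wavelength, height, and period (only the force-12 values come from Rocheron’s description), a deformer pushes a flat layout grid into waves, and a keyframed multiplier stands in for the hand animation layered on top. The table entries for other forces, the API, and the grid below are illustrative, not MPC’s tools.

```python
import math

# Rough sketch of a Beaufort-driven layout-stage wave deformer with a manual
# keyframe multiplier on top. Only the force-12 numbers come from the article.

BEAUFORT = {
    # force: (wavelength m, wave height m, period s) -- forces 10 and 11 are invented
    10: (120.0,  9.0,  8.0),
    11: (180.0, 12.0,  9.0),
    12: (250.0, 17.5, 11.0),
}

def storm_wave_height(x, y, t, force=12, keyed_scale=1.0, direction=0.0):
    """
    Height of the layout ocean at (x, y) and time t. keyed_scale stands in for
    the hand-keyframed adjustment used to make a particular wave bigger or faster.
    """
    wavelength, height, period = BEAUFORT[force]
    k = 2.0 * math.pi / wavelength
    omega = 2.0 * math.pi / period
    phase = k * (x * math.cos(direction) + y * math.sin(direction)) - omega * t
    return keyed_scale * 0.5 * height * math.sin(phase)

def deform_grid(points, t, **kwargs):
    """Apply the deformer to a flat grid of (x, y, z) points to get a layout surface."""
    return [(x, y, storm_wave_height(x, y, t, **kwargs)) for x, y, _ in points]

if __name__ == "__main__":
    grid = [(x * 10.0, y * 10.0, 0.0) for x in range(5) for y in range(5)]
    print(deform_grid(grid, t=22 / 24.0, force=12, keyed_scale=1.2)[:3])
```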
The result gave the director a close representation of the final look for approvals. “It doesn’t have the quality of a simulated surface, but it was convincing enough,” Rocheron says. And, animators had a choreographed surface of the water they could use to animate the lifeboat.
“The geography of the ocean wasn’t going to change, so the animators could do full-shot animation,” Rocheron says. “The simulated water would interact with the boat, but it wouldn’t change the surface.”
Rhythm & Hues artists used Massive software to control the digital flying fish in this all-CG shot.
With the water surface and choreography created and approved, the MPC crew then used Scanline’s Flowline to move the water flow on top. “Our R&D and effects departments worked with Scanline to develop methods to simulate on top of the layout surface,” Rocheron says. “The surface drives a full 3D simulation, but instead of calculating 200 meters of ocean depth, we simulate only two to five meters deep to compute the water flow.” Reducing the number of voxels in depth allowed them to compute a larger surface and, thereby, produce an extremely large-scale ocean.
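A back-of-the-envelope calculation shows why the shallow band pays off. The voxel size and surface area below are assumptions; only the depths (roughly 200 meters versus a few meters) come from Rocheron’s description.

```python
# Back-of-the-envelope look at the voxel savings from simulating a shallow band.
# Voxel size and surface area are assumed; only the depths come from the article.

VOXEL_SIZE = 0.25            # meters per voxel (assumed)
SURFACE_AREA = 400 * 400     # square meters of ocean surface (assumed)

def voxel_count(depth_m):
    cells_per_square_meter = (1.0 / VOXEL_SIZE) ** 2
    layers = depth_m / VOXEL_SIZE
    return SURFACE_AREA * cells_per_square_meter * layers

if __name__ == "__main__":
    full_depth = voxel_count(200.0)
    shallow_band = voxel_count(3.0)
    print(f"full depth:   {full_depth:,.0f} voxels")
    print(f"shallow band: {shallow_band:,.0f} voxels ({full_depth / shallow_band:.0f}x fewer)")
```

At any fixed voxel size, the saving scales directly with the depth ratio, which is what freed the team to spend resolution on a much wider surface instead.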
With the water flow moving along, the artists then added dynamic spray, mist, bubbles, and foam. “We simulated the wind, as well,” Rocheron says. “The spray fell into the water, became bubbles, and turned into foam. We used Flowline to simulate all those particles.”
Oceans of particles. “We reached 1.5 billion particles for one shot, which was kind of a record for the studio,” Rocheron says. To render the stormy seas, the crew split the simulations among 20 machines in the renderfarm. Even so, some shots took a week to calculate.
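The article does not say how the simulations were divided among those machines, so the slab-by-position scheme below is purely an assumption; it only illustrates the general idea of carving one enormous particle domain into pieces that separate machines can work on.

```python
# Toy sketch of splitting one huge particle set into slabs, one per farm machine.
# The slab-by-x scheme and domain bounds are assumptions made for illustration.

NUM_MACHINES = 20

def machine_for(x, domain_min=-200.0, domain_max=200.0):
    """Assign a particle to a machine based on its x position (simple slab split)."""
    t = (x - domain_min) / (domain_max - domain_min)
    return min(NUM_MACHINES - 1, max(0, int(t * NUM_MACHINES)))

if __name__ == "__main__":
    particles = [(-180.0, 3.1), (-5.0, 0.2), (142.0, -1.7)]   # (x, z) positions
    for x, z in particles:
        print(f"particle at x={x:+.1f} -> machine {machine_for(x)}")
```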
Although the team would usually replace the water filmed in the wave tank, rotoscope the boat, and place it into the fully CG ocean, the plates provided a base for lighting design and for creating the skies, which were animated matte paintings.
The storm that sinks the ship happens at night, and in those shots, the main light source was the ship itself. “It was hard,” Rocheron says. “We had lots of reflections in the ocean and a lot of raytracing and subsurface scattering of the ship’s lights into the ocean. We did almost a year of pre-production to discover how we would simulate and render those shots. They took a long time to render, but they were faster than we planned for. And, we could render the giant water surfaces and particles at the same time.” For rendering, MPC uses Pixar’s RenderMan; for compositing, The Foundry’s Nuke.
During a later storm, which happens in full daylight, Lee wanted the ocean to develop a different character. “It happens at the peak of Pi’s craziness and loneliness,” Rocheron says. “[Lee] wanted a vision of the fury of the ocean. Instead of having it look extremely realistic, he wanted us to push it. Put in more wind, more spray, bigger clouds. Make it more dramatic. But the process for creating the water was exactly the same.”
When a shot became too dramatic, the crew used a digital double for Sharma. “We could make the tank shots work when the camera was close to Pi and we could replace all the water,” Rocheron says. “But, it was technically impossible to shoot any medium or wide shots with waves between 15 and 20 meters high and 20 meters long. The only thing possible was to make them all-CG shots.”
The crew also created and animated a zebra that jumps into the lifeboat with Pi during the storm sequence. “We got the model and textures from R&H and conformed that into our pipeline,” Rocheron says. “It was pretty standard—we’ve been doing fur and muscles for years. But, we spent a long time working with Ang and Bill [Westenhofer] deciding how visible the zebra had to be. Ang said the most important thing was the relationship between Pi and the [sinking] ship, so we needed to have that separation. But we also had to put the zebra in the shots, so we kept the performance subtle.”
The other animals on deck were in the background in the shots and the tiger was in the distance. “When the tiger was in a close-up, we shared the shots with Rhythm,” Rocheron says, noting that R&H created and animated the tiger in those shots.
Richard Parker, Tiger
A real tiger, one of four cast for the film, appears in 14 percent of the shots. The rest of the time, Richard Parker is digital, a CG tiger created at Rhythm & Hues. “We’ve done digital animals before,” Westenhofer says. “We try to make them as real as possible, and then they go sing and dance. This was our opportunity to really fool somebody. So I wanted to base every shot on reference. I told Ang [Lee] it is too easy to anthropomorphize something even if you don’t want to. I wanted a real tiger in there to cut with. I wanted to force the bar to be pushed.”
As a result, Westenhofer found himself with four real Bengal tigers on set. “They gave us better reference than we would have gotten elsewhere,” he says. “We had eight weeks with real tigers in the real lighting situation. And Erik De Boer, our animation director, had so much reference that we could comb through. He got close-ups of paws moving.”
Three of the Bengal tigers the crew brought to Taiwan came from trainer Thierry Le Portier’s collection of wild cats in France, including the one Ang Lee picked as reference for Richard Parker. “When Ang saw that one, King, he said, ‘This is Richard Parker,’ and Thierry said, ‘Oh great. He’s the hardest one to work with.’ ” Footage of King in France and on set provided modelers and look-development artists led by Betsy Asher-Hall with specific reference.
The fourth tiger, Jonas, came from Canada. “They all had different qualities,” Westenhofer says. “King, his daughter, and his cousin from Thierry were all wild. But, Jonas was more bottle-fed.”
Richard Parker, who is approximately six feet long from nose to butt, spends much of his time during the course of the film in a small part of the 27-foot lifeboat not covered by a tarp. “He sits or paces in a little section,” Westenhofer says. “Pi’s territory is on top of the tarp, but it’s too yielding for the tiger.”
To help prepare the tigers for shots in the boat, the trainers put the cats in a den that also contained a boat on a gimbal, a fake camera, and a bluescreen. Although there was no water under the boat, the arrangement later provided Lee with footage of a tiger in a rocking boat that R&H compositors could blend with other live-action plates and with digital water.
“We could use the real tigers for single shots, nothing too crazy,” Westenhofer says. “The way Pi comes onto the boat without getting mauled is that there’s a tarp covering the boat that’s half open.” Usually, the tarp was practical, but when the digital tiger interacted with it, the tarp was digital, too. “[The Canadian tiger] Jonas could come out from under the tarp in the boat. The other tigers would have shredded it,” he adds.
For one sequence, the crew filmed one of the tigers jumping into the water. “The underwater DP was sitting in the tank with an underwater camera, and the tiger swam over his head,” Westenhofer says. “It might be the only 3D shot that exists of a tiger swimming.”
But during most of the movie, Richard Parker was digital. Creating the digital tiger was not new territory for the crew at R&H, who had, after all, created the lion Aslan in Narnia. “There are a lot of under-the-hood things, but fundamentally he’s the same as the lion, done in a better way,” Westenhofer says. “What makes Richard Parker excel is not technology, but the attention to detail. Asher-Hall spent a year leading look-dev.”
Two areas of technology were new, however: raytracing and subsurface scattering in the fur. “He has 10 million hairs, and we were able to raytrace the hairs completely using our software Ren. We got a full bounce light contribution from the lifeboat and the environment lighting. The global illumination was as true as possible. And, we also had subsurface scattering in the fur bouncing around, casting a warm glow in the shadows.”
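R&H’s Ren renderer and its fur raytracing are proprietary, but one small ingredient of “a full bounce light contribution from the environment” is gathering incoming light over the hemisphere at a shading point. The toy sketch below does only that, with an invented sky function and none of the hair scattering or subsurface work the quote describes; it is a flavor of the idea, not a production hair shader.

```python
import math
import random

# Drastically simplified illustration of gathering environment light at one point
# by averaging a toy sky over the hemisphere around the surface normal. Ren
# raytraces full bounce light and subsurface scattering through 10 million hairs;
# nothing here resembles that.

def sky_radiance(direction):
    """Toy environment: brighter toward the zenith."""
    up = max(0.0, direction[2])
    return (0.40 + 0.60 * up,    # red
            0.45 + 0.55 * up,    # green
            0.55 + 0.45 * up)    # blue

def sample_hemisphere(normal):
    """Uniformly sample a direction in the hemisphere around a unit normal."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        length = math.sqrt(sum(c * c for c in d))
        if 0.0 < length <= 1.0:
            d = tuple(c / length for c in d)
            if sum(a * b for a, b in zip(d, normal)) < 0.0:
                d = tuple(-c for c in d)
            return d

def environment_irradiance(normal, samples=256):
    """Monte Carlo estimate of incoming sky light integrated over the hemisphere."""
    total = [0.0, 0.0, 0.0]
    for _ in range(samples):
        d = sample_hemisphere(normal)
        cos_term = sum(a * b for a, b in zip(d, normal))
        for i, channel in enumerate(sky_radiance(d)):
            total[i] += channel * cos_term
    # Uniform hemisphere sampling has pdf 1/(2*pi), hence the 2*pi factor.
    return tuple(2.0 * math.pi * c / samples for c in total)

if __name__ == "__main__":
    print(environment_irradiance(normal=(0.0, 0.0, 1.0)))
```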
The muscle and skin simulation system was not new. However, the tiger needed an additional skin layer. “A tiger has hard muscles with a loose skin sack,” Westenhofer says. “The muscle simulations did their thing with a skin simulation that rode like a tight sock, and then on top we had loose skin that could fold and wrinkle. We also did a lot of R&D with facial controls.”
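The layered setup Westenhofer describes (hard muscles, a tight skin “sock,” and a loose outer layer that can fold and wrinkle) can be caricatured in one dimension: the loose layer simply lags behind the tight layer like a damped spring. The constants and the single-point setup below are invented; real creature rigs do this across whole meshes with collisions.

```python
# Toy, single-value caricature of layered skin: the muscle simulation drives a
# tight "sock," and a looser outer layer lags behind it like a damped spring,
# which is what lets it fold and wrinkle. All numbers are invented.

def simulate_loose_skin(tight_skin_heights, stiffness=0.35, damping=0.75):
    """The loose layer follows the tight layer with lag (per-frame damped spring)."""
    loose = tight_skin_heights[0]
    velocity = 0.0
    trajectory = []
    for target in tight_skin_heights:
        velocity = damping * velocity + stiffness * (target - loose)
        loose += velocity
        trajectory.append(loose)
    return trajectory

if __name__ == "__main__":
    # Height of one tight-skin point over a few frames (muscle sim applied upstream).
    tight = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
    print([round(v, 3) for v in simulate_loose_skin(tight)])
```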
Modelers and riggers built a facial animation system using blendshapes based on reference. Tons of reference. “The tiger was hard technically, but it was harder performance-wise,” Westenhofer points out. “The tiger does what a tiger would do, but he does a lot. There’s definitely a character arc.” De Boer led a crew of 50 animators working in Los Angeles, India, and Kuala Lumpur who created Richard Parker’s dramatic performances.
Throughout the film, Pi feeds fish that he catches to the tiger, which is one way in which Pi preserves his higher status. But, as the film progresses, the tiger becomes thinner, almost emaciated.
“There’s a really tender moment near the end when Pi pulls the dying tiger onto his lap,” Westenhofer says. “During production, we sat with Ang and asked Thierry, the tiger trainer, what a tiger would do.” The trainer described a time in which one of his tigers, riddled with cancer, was near death. “He said the tiger let him pet her, and as he did, she looked up at him,” Westenhofer says.
To film that scene, the director had Sharma pull a blue sock onto his lap; the crew later replaced the sock with the digital tiger and simulated the interaction of Pi’s hands in the tiger’s fur.
Some of the most technically difficult shots for the crew at Rhythm & Hues, though, were those in which Richard Parker is in the water. “We had a couple with the tiger jumping into the water and shots in a swamped boat in the storm sequence, where MPC did its work,” Westenhofer says. “We had a convoluted pipeline for the water shots.”
First, the water simulation crew created a basic flow. Then the animators performed the tiger moving back and forth, keeping his head above water. That went back to water simulation to provide the interaction of water with the tiger and vectors for the fur dynamics. That result went back to the water team to add splashes.
“The water simulation told the technical animators which parts of the fur were soaked and what’s trickling up from the body,” Westenhofer says. “We went round and round and round the pipeline. The hardest shot and the one we spent the most time on was wherein Pi rips the tarp off the tiger and the tiger freaks. It had to be completely digital. The tiger with wet fur washes up into Pi in the swamped boat.”
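The hand-offs Westenhofer describes can be pictured as a loop between departments: water establishes a base flow, animation rides it, the water sim reacts to the animated tiger and returns wet maps and velocities for the fur, and splash elements go on last. The stub functions and data fields below are placeholders invented to show the shape of that loop, not anyone’s actual pipeline.

```python
# Schematic (not literal) sketch of the round-trip pipeline for wet-tiger shots.
# Each stub stands in for a whole department; the data fields and the fixed two
# passes around the loop are assumptions made for illustration.

def base_water_flow():
    return {"surface": "base ocean flow"}

def animate_tiger(water):
    # Animators keep the tiger's head above the water surface they are given.
    return {"tiger_motion": f"tiger keeps head above {water['surface']}"}

def water_tiger_interaction(water, animation):
    # Water sim reacts to the animated tiger; returns wet maps and velocities.
    return {"wet_map": "which fur is soaked", "fur_vectors": "water velocities on body"}

def fur_dynamics(animation, interaction):
    return {"fur": f"wet fur driven by {interaction['fur_vectors']}"}

def add_splashes(water, interaction):
    return {"splashes": "secondary splash and spray elements"}

def tiger_in_water_shot(passes=2):
    water = base_water_flow()
    shot = {}
    for _ in range(passes):                       # "round and round the pipeline"
        animation = animate_tiger(water)
        interaction = water_tiger_interaction(water, animation)
        shot.update(fur_dynamics(animation, interaction))
        shot.update(add_splashes(water, interaction))
    return shot

if __name__ == "__main__":
    print(tiger_in_water_shot())
```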
More Animals
When MPC’s zebra lands on Pi’s lifeboat after the family ship sinks, it breaks a leg and a hyena kills it. A real hyena appeared in about seven shots, but was otherwise a digital double created at R&H. “One of the unique things about the hyena is that when you tickled her, she’d smile,” Westenhofer says. “But that made her look ferocious. I’m proud of the tiger, but I think we took the hyena even further.”
Legacy Effects created a “dead,” practical zebra that was on the boat during filming, but R&H’s effects artists inserted a digital zebra in postproduction. “We replaced it with a CG zebra because of the interaction with the digital animals climbing on top,” Westenhofer says. The practical zebra provided lighting reference, though, for the artists creating the digital double.
The hyena next kills an orangutan that had landed in the lifeboat as well. “We shot a real orangutan in a zoo in Taipei for a scene in which she’s in the cargo hold of the ship,” Westenhofer says. “We brought the set to the zoo. For the rest of the film, she’s a digital creation.”
And then, a digital Richard Parker bursts out from under the tarp in the lifeboat and kills the hyena. “The tiger is still under the influence of drugs taken on the journey to sedate him,” Westenhofer says. “So in that 30-second shot, Ang wanted the tiger to kill the hyena, feel fear of the ocean, and have Pi in the way of his getting back under the tarp. What would a tiger do when it bursts out of a tarp and sees that it is surrounded by ocean? It would be shocked. Selling all those things was a challenge.”
For this and other shots, the animators looked for examples of how real tigers act. “We scoured reference footage to find clips,” Westenhofer says. “We’d find bits and moments for whole shots or little segments. We wanted to keep the animalistic quality.”
Rhythm & Hues artists also created sea life that interacts with Richard Parker and Pi throughout their journey, including flying fish, a whale, and a large surface-dwelling fish known as a mahi-mahi or dorado. “We pushed Massive to a new level for the flying fish,” Westenhofer says. “The software had a hard time dealing with different behaviors above and below the water and a surface that changes over time. We needed to have the fish dive into the water, fly around the boat, and skitter around the surface.” For those shots, the crew created a digital boat to have one complete package for raytracing.
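The crux of the flying-fish problem is that each agent must test itself against a water surface that moves, and change behavior when it crosses it. A toy version of that decision, with an invented wave and invented behavior names (Massive’s real agent brains are far richer), looks like this:

```python
import math

# Toy sketch: each fish agent checks its height against a time-varying surface
# and switches behavior accordingly. Wave, band width, and behaviors are invented.

def water_height(x, y, t):
    """A simple moving surface standing in for the simulated ocean."""
    return 0.6 * math.sin(0.15 * x - 1.2 * t) + 0.3 * math.sin(0.2 * y - 0.8 * t)

def fish_behavior(position, t, skim_band=0.2):
    """Pick a behavior from the fish's height relative to the surface under it."""
    x, y, z = position
    surface = water_height(x, y, t)
    if z > surface + skim_band:
        return "glide"        # airborne: fly around the boat
    if z < surface - skim_band:
        return "swim"         # underwater: school and dive
    return "skitter"          # right at the surface: skitter across it

if __name__ == "__main__":
    school = [(5.0, 2.0, 1.5), (8.0, -1.0, -0.7), (3.0, 0.5, 0.1)]
    for fish in school:
        print(fish, "->", fish_behavior(fish, t=1.0))
```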
At one point during their journey, Pi catches a dorado, and Legacy Effects built an animatronic fish to give Sharma something to work with on set. “We had a cable operator pulling on the animatronic fish as hard as he could,” Westenhofer says. “We replaced the tail and the body beyond the mouth and where Sharma grabbed it, and made [the digital fish] more energetic.”
The whale, created at R&H, found its way into a dream sequence crafted by BUF artists supervised by Mike Fink. “It’s a stylized journey,” Westenhofer says. “We see a shark swim by and a squid come into frame. They attack each other, turn into an amalgam of animals, and then break apart into a hippo, zebra, giraffe, and we’re now in a dreamy sequence. They go deeper and deeper, and we see plankton in the shape of Pi’s mother’s face for a second. We see a ship at the bottom of the ocean and then pull back into Pi. It’s a very pretty sequence.”
Rhythm & Hues artists animated the whale, and BUF artists animated the animals during the transition. “They had a shrink-wrapping system to have the skin of the whale wrap the other animals and then break apart,” Westenhofer says.
During another shot, bioluminescent jellyfish surround Pi and a whale breaches, spinning phosphorescent algae off its tail. Thinking about reference for that shot brought a lovely memory back to Westenhofer. “We had been doing reference shots of the raft to practice during the day, and that night Ang took us out on a boat across the water. He said, ‘Look!’ He waved his hand in the water, and we saw this awesome bioluminescence. That was the motivation for the whale shot.”
Later, there’s also a bit of bioluminescence on the island, which evokes Pi’s memory of the whale. “Ang has symbolism that carries through the film,” Westenhofer says, and offers an example: a statue of the god Vishnu sleeping on a bed of snakes that appears subtly in various places.
To create the woods on the island, set dressers at R&H used new instancing tools in Ren, the studio’s rendering software. “The metaphor is a banyan tree,” Westenhofer says. “We completely replaced a set piece to have one organism with roots folding up and instancing out with modifications.”
Massive helped the crew animate the 60,000 meerkats on the island. “They were the most fun to animate,” Westenhofer says. “We made ourselves laugh.” In fact, the animators added a sly joke. “The meerkats are climbing all over Pi. When he first stands up, all the meerkats look at him except one in the foreground that’s looking the wrong way.”
For the skies here and throughout the film, the R&H crew drew from a library of 120 skies stitched together from HDRIs taken on a beach in Florida, with an eye toward making sure none of the images had boring sections that might reflect into the water as a camera panned.
“We had someone sit on the beach with a Canon 5D equipped with a motorized rig and take photographs when nice clouds went by,” Westenhofer says. “We had 18k HDRIs that we could use as cycs and actual backgrounds. Skydomes that we would raytrace into the water. We’d animate cloud movements and break them into different layers.”
Then, the trick was finding a sky among the 120 in the library that matched the mood Lee wanted to evoke at that moment in the film. “He’d say, ‘This wants to be sunny,’ or use adjectives like ‘melancholy,’ ” Westenhofer says.
It’s rare for people to pick adjectives like “gorgeous,” “exquisitely beautiful,” and “seamless, dreamlike” to describe a film in which visual effects play such a large role. But that’s often the case with Life of Pi. Writes R. Kurt Osenlund in Slant magazine, “The film’s own saving grace is its feast of magnificent imagery, which, in Lee's hands, rights wrongs by being something close to holy.”
Life of Pi is a film that Lee couldn’t have made without computer graphics, couldn’t have made at all a few short years ago, and having been made, positions the crew in line for Oscar recognition. “You don’t often get the chance to create art through visual effects,” Westenhofer says. “It’s one of the most rewarding feelings I’ve had in a long time.”
Barbara Robertson is an award-winning writer and contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.