Water World
Volume 35, Issue 3, April/May 2012

When it comes to films requiring oceans of simulation—features like The Perfect Storm, Poseidon, and Pirates of the Caribbean—in which the sea takes a starring role, Industrial Light & Magic has consistently hit a high-water mark. But the water simulations required for those films were modest compared to the work required for Universal Pictures’ Battleship.

Much of Battleship takes place in a square of the Pacific Ocean, blocked off, like a game board, by a force field generated from an alien ship—the aliens have responded fiercely to an innocent communication from Earth, sending a reconnaissance team to see whether the planet is worth invading. As in the classic board game upon which director Peter Berg loosely based the film, six ships battle within the force field: three destroyers—two from Japan and the John Paul Jones from the US—and three alien ships called “Stingers.” In addition, the USS Missouri and a fourth alien flagship play a large part in the film. And so does the water.

“One morning Pete [Berg, director] said that the alien ships would stand on the water and move like water bugs,” says visual effects supervisor Grady Cofer. “And, they’d recycle water. This was his idea. So, we needed water constantly rippling over the surface. And, the ships would be in lots of shots. We needed to re-think how we did fluid simulations. We knew we’d have to raise our game.”

The alien ships have two types of weapons. In a nod toward the board game, the Stingers fire large cylindrical pegs that cause massive destruction when they hit something. The second weapon is a “shredder,” a ball made of rotating bands covered with blades. The shredder has a bit of intelligence, can move in any direction, and chops everything in its path to bits.

In fact, during a climactic battle, a shredder rips through a destroyer, cuts it in half, and the two sides sink into an ocean now filled with life rafts, CG swimmers, debris, and explosions. The water is filled with splashes; there’s mist in the air and foam everywhere.

“I think that was the most complex shot we’ve ever done at ILM,” says digital production supervisor Doug Smythe. “It’s massive, 3,034 frames long. We ran 67 fluid simulations using 19TB of simulation data, and we had 294 unique render elements in the comp. It’s a monster. But, it looks good.”

Smythe won an Oscar for the visual effects in Death Becomes Her, has received three Sci-Tech Academy Awards, and his credits include work on Terminator 2, Star Wars: Episode I, The Perfect Storm, Pirates of the Caribbean: Dead Man’s Chest and At World’s End, Iron Man, and two Harry Potter films, among many others. He joined ILM in 1988. So, when he says “most complex shot,” that opinion holds water.
 
“In the past, we always saved the big shots until the end,” Smythe says, “the really, really hard shots. We knew we couldn’t do 300 shots all in that category and we couldn’t have the two or three smartest simulation experts in the studio doing all the shots. We have a lot of solid technology from the last round of shows for modeling, creating materials, and so forth. Water was the biggest unsolved problem—to do the simulations on the scale we needed.”

Speed and Reliability
ILM’s system has evolved from a computational fluid dynamics solver called PhysBAM, created by a group of researchers in the computer science department at Stanford University led by Ron Fedkiw. The studio had first tried the particle-level set system (which researchers at Stanford continue developing and ILM continues using) to melt liquid chrome in Terminator 3: Rise of the Machines (2003) and pour a glass of wine down a skeletal pirate’s throat in Pirates of the Caribbean: The Curse of the Black Pearl (2003). PhysBAM moved into the mainstream when it poured digital water off a magical ship as it lifted from a CG lake in Harry Potter and the Goblet of Fire (2005). For that film, ILM bought a four-processor workstation with 32GB of memory specifically to run the simulation.
 
A single machine couldn’t handle the water needed to roll the Poseidon, though, so for that 2006 film, the R&D team engineered a way to split the simulation for each frame into multiple pieces that could run on different processors; that is, they parallelized the fluid-solver code to achieve high-resolution splashes in a large volume of water. Later, to create a maelstrom in Pirates of the Caribbean: At World’s End (2007), the engineers and technical directors manipulated the simulation grid. But even so, each final high-resolution simulation could take as long as three to four days to calculate. And always, simulation experts needed to run a separate system to create splashes, spray, mist, and foam.

One of those simulation experts, CG Supervisor Willi Geiger, joined the Battleship crew 18 months before the movie’s release date. Geiger had set up the pipeline for Poseidon, had worked on the second Pirates film, consulted on Pirates of the Caribbean: At World’s End, and helped create a wave of water that flooded Washington DC in Evan Almighty.

“I looked at the bidding sheets for Battleship and compared that work to the scope of the work we had done before and how long it had taken,” Geiger says. “That’s when I started panicking. It was double anything we’d done before, and we had half the time. I realized that we’d have to rebuild our tool set to have any hope. It was the longest and most challenging project I’ve ever been involved with.”

To confront the problem, ILM formed a “Battleship Water Department.” Among the team members were Smythe, who oversaw all the areas of CG technology; Geiger, the CG supervisor; Nick Rasmussen, who leads the effects simulation group in ILM’s R&D department; R&D engineer Rick Hankins, who stayed with the show from system development all the way through final shots; and R&D developers Andrew Johnson and Cliff Ramshaw.

“When Willi came to me, Nick Rasmussen, and R&D in general, we looked at what they wanted to do, the time frame we had for each shot, and the scope of the work,” Hankins says. “We’re production-driven, so the work the developers get is what’s most important at the time. But, some of us have fluid and water specialties, so we were pretty excited about getting to work on new water simulations. It wasn’t as scary to us as to someone like Willi and Doug, who have to deliver shots.”

Geiger gave the R&D team a mandate: make the water simulation more accurate, detailed, precise, reliable, and five times faster. That is, the time from an artist starting a shot to delivering a simulation needed to shrink fivefold. To achieve that goal, the team spent between six and eight months developing new “sub-tools” and new methods.

“We realized that we could hold our largest sims in 32GB of memory or so on a single machine,” Smythe says. “That wasn’t the limiting factor. The biggest limiting factor was time. It was how long we wanted to wait, not how much memory a machine had. So we decided to make the simulations faster. And then, they could be as complicated as we wanted.”

Although ILM’s engineers had parallelized the fluid system for Poseidon, it wasn’t multi-threaded. “We based our previous system on MPI [Message Passing Interface specification] for communication between different processes on different or multiple machines, but relied on the infrastructure built on top of our rendering scheduler to grab machines and spread the work out,” Hankins explains. That meant the artists couldn’t predict how long a simulation would take; the result depended on available resources.

“If you couldn’t get all fast machines, the weakest machine in the group would slow the whole simulation down,” Hankins says. “With a threaded model, we could run the simulation on one [multi-core] machine and predict the simulation completion time. But we had to learn how to do threading properly. We went through every single algorithm—and, a fluid system can have many components—to find the best way to decompose the problem. MPI was sometimes the best solution, sometimes not.”

When they finished, the difference was dramatic: “With the old-style system, we had to grab exclusive use of as many as 16 machines and lock everyone out of them, and coordinate those 16 machines for however long the simulation needed to run, which could be days, and it just became impossible,” Geiger says. “Now, a simulation that might have tied up eight machines for two days can run on one eight-core machine and finish in a few hours.” Moreover, because the simulation ran on one machine, it was more reliable. So, check two items off Geiger’s list: speed and reliability.

“Multi-threading was without a doubt one of the biggest things that allowed us to have any hope of finishing this movie,” Geiger says.
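
ILM’s solver itself is proprietary, but the shift Hankins describes, from MPI processes scattered across the render farm to threads sharing one multi-core machine, is easy to sketch. Below is a minimal, illustrative Python version of the idea: one solver stage (a stand-in smoothing pass, not ILM’s code) decomposed into row bands that a thread pool works through in parallel on a single machine.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def solve_band(src, dst, y0, y1):
    """Run one solver stage (a stand-in smoothing pass) on rows y0:y1."""
    # NumPy releases the GIL inside vectorized ops, so bands truly overlap.
    dst[y0:y1] = 0.25 * (np.roll(src, 1, 0) + np.roll(src, -1, 0) +
                         np.roll(src, 1, 1) + np.roll(src, -1, 1))[y0:y1]

def threaded_step(src, n_threads=8):
    """Decompose one simulation step into row bands, one task per core."""
    dst = np.empty_like(src)
    bounds = np.linspace(0, src.shape[0], n_threads + 1, dtype=int)
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        for y0, y1 in zip(bounds[:-1], bounds[1:]):
            pool.submit(solve_band, src, dst, y0, y1)
    return dst  # the pool joins every thread on exit, so dst is complete

velocity = np.random.rand(512, 512).astype(np.float32)
velocity = threaded_step(velocity)  # one predictable step on one machine
```

Because the whole step runs on one machine, completion time no longer depends on whichever farm nodes happen to be free, which is what made simulation times predictable.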


At top, Industrial Light & Magic artists simulated water using new tools, fire using the studio’s Plume system developed earlier, and (at bottom) destruction with existing rigid-body dynamics systems to create action sequences such as these for Battleship.


Details, Precision, Accuracy
And that was only the start. “We measured where all the time was going,” Hankins says. “We knew we would have to bring in junior artists, not just high-level TDs, so the simulation needed to be less technical from the user’s standpoint. It needed to behave correctly.”

Until this film, to create the ocean waves, splashes, spray, mist, and foam, the simulation artists would run a base water simulation, which provided the undulating surface, and then separate particle simulations for splashes, spray, mist, and surface foam, which produced the scale, complexity, and volume. “We’d have all these individual simulations, which was a headache for the artists and for the compositors down the road who had to piece all the parts together,” Geiger says. “And, of course, it wasn’t physical. Water doesn’t behave in different ways. Spray turns into mist, lands on the water, and turns into foam.”

The goal for the new system was to streamline the workflow and, at the same time, make the water simulation more accurate. To do this, the developers decided to split the sim into two parts. One part, the base simulation, would, as before, create the surface and, with some secret sauce overlaid on that simulation, the splashes. A second group of sims driven from the base water splashes would create the spray, mist, and foam.

Supervising the Show

Grady Cofer and Pablo Helman shared the job of supervising the visual effects for Battleship. Cofer was involved from the start in 2009, when director Peter Berg pitched the idea to Industrial Light & Magic. Helman joined the crew later, after he finished work on The Last Airbender.

All told, the film had 1,500 visual effects shots, with ILM creating the majority. In addition to a crew of 250 at ILM, the supervisors and Animation Director Glen McIntosh oversaw work from three other studios: Image Engine, The Embassy, and Scanline. The effects consisted of the water simulation, creature animation, ships, and destruction. The creatures were linebacker-sized humanoid aliens, but with four fingers spread around their hands like a claw; they usually appeared wearing a full suit of armor.
 
“Peter Berg was the right director for this project,” Helman says. “He loves football and he loves the Navy. When you put that together with the energy he has, you get an action movie, and action movies are difficult. It isn’t just about quick cuts. It’s about the story and caring about the characters that are fighting.”

Both Cofer and Helman spent time with the Navy, filming from boats and helicopters during the biennial RIMPAC naval exercises near Hawaii in June 2010, during which 32 ships, five submarines, 170 aircraft, and 20,000 personnel from 14 countries spent a month carrying out maritime operations that even included the sinking of ships. During that month, Cofer was on a destroyer for 10 days, while Helman filmed from boats out of Pearl Harbor and from helicopters. “It was great,” Helman says. “We could internalize how big the ocean is, the hugeness of those ships, and how hard it is to maneuver them.” Footage filmed during RIMPAC became a source for background plates, textures, and reference.

Back at ILM, Cofer and Helman worked in adjoining offices, and both, having moved into their roles as visual effects supervisors from digital compositing, reviewed and fixed shots on Autodesk Inferno systems. “We share the same aesthetic,” Helman says. “We’d look at a shot and point to the same things. We’d put in camera moves, lens flares. It’s easier than giving notes.”

–Barbara Robertson

“We had used the base particle-level set method that Stanford pioneered for years,” Hankins says, “long enough to be clear about the difficulties we always had in production. So, we added another layer to it that helped us achieve splashy structures out of the box.”

The base simulation might typically run inside a virtual volume measuring 1,000 by 2,000 feet, with a grid dividing that volume into two-foot cells. An artist inserts geometry representing ships or other elements into the volume and sets the fluid in motion by specifying such parameters as wind and velocity. As the fluid moves through the cells in the volume, the simulator derives an implicit surface. Now, in addition, a particle-based fluid simulation coupled to the base simulation kicked in when the grid was too coarse to create detailed splashes.

“Basically, the system knows when the underlying implicit surface is under-resolved,” Hankins says, “that is, when the grid can’t capture the detail the simulation requires. We identify that and generate new particles to fill in the missing detail.”

These new splash particles are fluid solvers in their own right, with dynamics governed by an optimized system the team developed specifically for this purpose. They can have their own velocities. They’re fast. And they can affect the underlying surface, as well.

“The key realization was that the water in a simulation needs to know about its surroundings,” Geiger says, “the bits of water near to it. But, it doesn’t need to know about the water miles away. So, rather than taking a huge grid and subdividing it, the system dynamically builds little grids, sometimes thousands of little grids, exactly where we need them. The small grids can have whatever resolution we want. We would get down to features a few inches in size.”

That gave Geiger the detail he had asked for. And, because the splashes were part of the base simulation, they were correct. Cross off precise and accurate, too.

“This is all very similar to the way the particle-level set naturally works,” Hankins says. “So, in that sense, it’s not new. What’s new is how we were able to smoothly and seamlessly blend these particles into the underlying base-level set. The way we did the dynamics gives them a nice splashy structure so the artists didn’t have to spend a long time to get a good look. It worked out of the box, but they still had controls.” Controls affected such characteristics as how loose or tight a splash would be, how big or small, or how clumpy.
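
The refinement step Geiger and Hankins describe can be sketched in 2D: flag the places where the underlying level set is under-resolved, then spawn small, high-resolution grids exactly there. The detection test below (a signed-distance field’s gradient should have length near 1 close to the surface), the thresholds, and the refinement factor are all invented for illustration.

```python
import numpy as np

def spawn_little_grids(phi, cell=2.0, refine=8, thresh=0.5):
    """Flag coarse cells whose level set is under-resolved and allocate a
    small, finer grid over each one. The distortion test is an invented
    stand-in for ILM's detection criteria."""
    gy, gx = np.gradient(phi, cell)
    distortion = np.abs(np.hypot(gx, gy) - 1.0)
    near_surface = np.abs(phi) < cell           # only refine near the water
    flagged = near_surface & (distortion > thresh)
    grids = []
    for j, i in zip(*np.nonzero(flagged)):
        grids.append({                          # one little grid per cell
            "origin": (i * cell, j * cell),
            "cell_size": cell / refine,         # e.g. 2 ft down to 3 inches
            "phi": np.zeros((refine, refine), dtype=np.float32),
        })
    return grids

phi = np.random.randn(64, 64)                   # stand-in level-set field
print(len(spawn_little_grids(phi)), "little grids spawned near the surface")
```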

The second part of the simulation generated spray, mist, and foam: The splash particles emitted spray particles, and when the spray particles fell on a surface, they generated foam. “We set up the pipeline to do it automatically, but we didn’t run it as a single simulation,” Hankins says. “An artist would typically work with a low particle count, a few million maybe, to figure out if the motion and the balance between spray and mist were correct, and if it needed additional forces. It was something they could turn around quickly at a workstation. But for rendering, we’d crank up the level of detail with, say, 300 million particles on eight to 12 [eight-core] machines. It was a trivial parallel problem because the machines could work independently.”
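
The splash-to-spray-to-foam chain reads naturally as a set of particle rules. A minimal sketch under invented units and constants, not ILM’s system:

```python
import numpy as np

rng = np.random.default_rng(7)

def advance_spray(pos, vel, surface_z=0.0, dt=1.0 / 24.0, gravity=-32.0):
    """Advance spray particles one frame; any that land on the water are
    handed off as foam seeds. The units (feet, seconds) and every constant
    here are illustrative."""
    vel[:, 2] += gravity * dt
    pos += vel * dt
    landed = pos[:, 2] <= surface_z
    foam_seeds = pos[landed].copy()     # landed spray becomes surface foam
    return pos[~landed], vel[~landed], foam_seeds

# Splashes emit spray with an upward bias plus scatter, mirroring the
# splash -> spray -> mist -> foam chain described above.
spray_pos = rng.uniform([0, 0, 0], [100, 100, 5], size=(1_000_000, 3))
spray_vel = rng.normal([0, 0, 5], [6, 6, 15], size=(1_000_000, 3))
spray_pos, spray_vel, foam = advance_spray(spray_pos, spray_vel)
print(f"{len(foam):,} spray particles landed and converted to foam")
```

Because each batch of particles advances independently, scaling from a few million to 300 million is, as Hankins says, trivially parallel: split the arrays across machines.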

In addition, the artists ran separate simulations of the air around the water and coupled that to the base water simulation to produce swirls and eddies. One of those artists was Hankins. “This was the first time I’ve been able to be on a film from the beginning of R&D all the way through production,” he says. “Mostly we do R&D and then pass on our work to production. Being able to see something through to the end was awesome.”

At the end of this process, the datasets were so big that the previous system, which figured out which cells contained fluid and generated a mesh surface for rendering, bogged down. “We had to rewrite the mesh-generation part of the rendering system, as well,” Geiger says. “But, actually, it uses the same idea. Rick [Hankins] said, ‘Instead of having one huge grid, let’s dynamically build as many small grids as we need. We can build the surface in tiny, tiny parts relative to the whole thing, and then stitch them together.’ It was more work because we had to stitch all these small parts of the surface back together, but because we processed all the parts of the surface independently, they could easily be multi-threaded; we could send them off to their own core. So, suddenly, the surface reconstruction became much faster, as well.”
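
The same tile-and-stitch idea, applied to surfacing instead of simulation, might look like the toy 2D version below: each small tile extracts its piece of the zero crossing independently (a production system would run marching cubes there), and the pieces are stitched afterward. Tile size and the crossing test are illustrative only.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def mesh_tile(phi, tile):
    """Find the zero crossings of the level set inside one small tile."""
    y0, y1, x0, x1 = tile
    band = phi[y0:y1 + 1, x0:x1 + 1]      # share one row/col with neighbors
    sign = band >= 0.0                    # so the seams line up when stitched
    crossing = sign[:-1, :-1] != sign[1:, 1:]
    ys, xs = np.nonzero(crossing)
    return np.stack([ys + y0, xs + x0], axis=1)

def reconstruct_surface(phi, tile=16, n_threads=8):
    """Dynamically tile the domain, mesh each tile on its own core, stitch."""
    h, w = phi.shape
    tiles = [(y, min(y + tile, h - 1), x, min(x + tile, w - 1))
             for y in range(0, h - 1, tile) for x in range(0, w - 1, tile)]
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        pieces = pool.map(lambda t: mesh_tile(phi, t), tiles)
    return np.concatenate(list(pieces))   # stitch the per-tile pieces

phi = np.random.randn(256, 256)           # stand-in level-set field
print(reconstruct_surface(phi).shape[0], "surface cells found across tiles")
```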

Because the other teams would render the hard-surface objects, the environments, and the creatures in Pixar’s RenderMan, and because he had worked with RenderMan for years, Geiger stayed with that program (Version 15, which was available when they started) to raytrace the water.
 
“We had various ways of cheating,” Geiger says. “For example, to avoid fully raytracing all the geometry, we’d run a pre-pass, where we essentially calculated the thickness of the water at each point in the sea to know how much volume of water a ray would pass through. Often, explosions, not the sun, were the brightest lights. So, we’d cache out illuminations of explosions and use them to scatter light. That way we got the global illumination and still persuaded RenderMan to run.” But, if he had switched to another renderer, he would have had integration problems with the other elements.
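
Geiger’s thickness pre-pass boils down to measuring how much water each ray crosses and then attenuating light with a Beer-Lambert-style falloff, rather than tracing every internal bounce. A toy top-down version over a depth map, with an invented absorption constant:

```python
import numpy as np

def attenuate(light, thickness, absorb=0.08):
    """Beer-Lambert-style falloff: the more water a ray crosses, the less
    light survives. The absorption constant is invented for illustration."""
    return light * np.exp(-absorb * thickness)

# Pre-pass: for a simple top-down camera over a height field, the water
# thickness under each pixel is just the local depth. A general pre-pass
# would sum the entry/exit spans along each camera ray instead.
depth = np.random.uniform(0.0, 40.0, size=(480, 640))   # feet of water
image = attenuate(1.0, depth)
print("brightest:", image.max().round(3), "dimmest:", image.min().round(3))
```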

“There’s never an easy way,” Geiger says. “You just pick the least difficult.”

To help speed production, the crew used pre-baked animation cycles and pre-baked simulations in distance shots, and sometimes shading techniques within RenderMan to create water without running the simulation.

“We initially came up with the idea for Pirates 3,” Smythe says. “We use a noise modulation in the displacement, and then by animating the bands of noise modulation, you can get something that looks like flowing water. You have to make sure the space you’re computing in is correct, though, so if a ship flips over, the water still flows downward.”
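
A rough approximation of the trick Smythe describes: scroll a few octaves (bands) of noise through the displacement along the flow direction, so a static surface reads as moving water. Everything below is a guess at the flavor of the technique, not ILM’s shader:

```python
import numpy as np

def value_noise(x, y, seed=0):
    """Cheap lattice value noise, a stand-in for a real shader noise."""
    rng = np.random.default_rng(seed)
    lattice = rng.random((64, 64))
    xi = np.floor(x).astype(int) % 63
    yi = np.floor(y).astype(int) % 63
    xf, yf = x - np.floor(x), y - np.floor(y)
    top = lattice[yi, xi] * (1 - xf) + lattice[yi, xi + 1] * xf
    bot = lattice[yi + 1, xi] * (1 - xf) + lattice[yi + 1, xi + 1] * xf
    return top * (1 - yf) + bot * yf

def flowing_displacement(u, v, t, flow_speed=1.5):
    """Scroll bands of noise along the downhill direction so static
    geometry reads as flowing water. Octave counts and speeds are
    invented for illustration."""
    d = 0.0
    for octave in range(3):                 # a few bands of noise
        freq, amp = 2.0 ** octave, 0.5 ** octave
        d += amp * value_noise((u - flow_speed * t) * freq, v * freq,
                               seed=octave)
    return d

u, v = np.meshgrid(np.linspace(0, 8, 256), np.linspace(0, 8, 256))
frame_a = flowing_displacement(u, v, t=0.0)
frame_b = flowing_displacement(u, v, t=0.5)  # water has visibly "flowed"
```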

Ships and Shredders

Modelers at ILM received McNeel Rhino files from the film’s art director, Aaron Haye, for the alien battleships and flagships. “Aaron used to work in the ILM model shop,” says modeling supervisor Russell Paul. “The designs came to us fairly complete.” For the US and Japanese destroyers, the artists used photographs of the current Arleigh Burke class of destroyers, as well as photomodeling, as a starting point for the models.

“We then did some in-shot work to align our CG versions to the plates taken during the RIMPAC exercises,” Paul says. For the interiors, the modelers matched the sets.

The USS Missouri was the most challenging ship. For this, the modelers had a 3D scan of the entire dry-docked, World War II battleship. “It was a raw cloud of points,” Paul says. “There was one point for every 20mm of the 900-foot-long ship; a lot of data. So, we did a loose processing, not a detailed mesh.”

The filmmakers towed the ship out of the dock and into open water for some shots, and for those shots, ILM replaced the water with digital simulations to make it appear as if the ship were moving. Modelers also added moving parts—a radar dish and gun turrets. But, some of their most difficult work was in building detailed, damaged versions of the ship. “We tried to find a balance between having a model detailed enough to sell the look and having a model that could still get rendered,” Paul says. “We didn’t model all the teak boards on the deck, for example. They’re only a few inches wide. But we did model some of the boards around a damaged area.”

As for the shredder, although it looks complex, the repetitive design meant it was relatively easy to model. “It has lots of little blades, and each one has controls so you could get a different feel,” Paul says. “But, it’s so repetitive; we could copy the blades and stack them up. The difficulty was in achieving the look.”

–Barbara Robertson

In Process
Animators started the process by keyframing the ships through the water. “That drove everything,” Geiger says. “They had to be aware of how a ship moves.” Although a few shots were completely synthetic and the alien ships were always CG, most of the shots started with something real.

The alien ships had the biggest performances and generated most of the simulation work. “They are 500 feet long and weigh 50 tons, and they can jump,” says Glen McIntosh, animation director. “We needed to build character into what we were animating.”

The destroyers were often real, filmed during RIMPAC naval exercises, with CG ships added or replacing those in the footage. “Generally, we’d do a layout pass that would place low-resolution representations of all the ships in a scene, either matching ones in a plate or using representations of ones we’d add,” Smythe says. “That layout would include the water plane, the camera matchmove, and the horizon line. Then, if that was all we needed, the layout team would animate the ships according to the storyboards or post-vis.”


The shredders spinning above look complex, but their size and symmetry made them easier for the ILM modelers to create than the highly detailed, 900-foot-long USS Missouri or the much smaller but asymmetrical body armor for the aliens.

 
That layout then went to various places in the pipeline—the lighting TDs, the water simulation team, and the effects TDs creating explosions, debris, and other simulations. When the ships moved little and sat in the background, the full water simulation wasn’t required. But the alien ships were often part of the action—the Stingers, for example, jumped out of the water like water bugs, causing water to stream off the sides. Then, artists placed collision geometry, which lined up with the animation geometry, into the fluid simulation. Easier said than done.

The complex 3,034-frame shot Smythe described earlier started with animation of the destroyer buckling in half and then breaking in two. Animators moved the two pieces up through the water according to an imagined force and drew an orange sphere to represent a fireball of an explosion. That animation file then moved on to the creature department, where artists fractured a ship model and created debris, and to the effects team for water and fire simulation.

“For the most part, we had a one-way pipe,” Geiger says. “We had separate simulations for fire and water. We didn’t couple the simulations because we wanted to have flexibility.”

For the explosions and resulting fire and smoke, the artists used ILM’s Plume software, a GPU-based system. The system, widely used on previous films, allows the artists to generate quick iterations and, therefore, more easily art-direct the simulations.


Director Peter Berg wanted the aliens to be flawed characters, not monsters or superheroes, as if they evolved along a similar but slightly skewed path. Animators started with iMocap data captured from stunt actors on location and then concentrated on the aliens’ hands and eyes (visible behind the visor) so the audience would know the armor-clad creatures weren’t robots.


“We couldn’t believe how quickly the artists turned them around,” Geiger says. “We’d attach fire to debris and use debris to drive an air field. In this film, the whole was much greater than the sum of the parts.”

Cofer says he knew the team could make the film when they started working on a shot for the trailer of the alien ship breaching the water. “We had done a splash test with our new simulation tool set and an alien ship,” he says. “We were getting scale and level of detail, but we weren’t there yet. And then Pete [Berg] said he wanted the ship to breach close to the camera, with water splashing off it. So, we just started doing iterations and fine-tuning. Then one day, someone tried a different parameter and it worked. And we kept elevating the work after. I think the results are astounding.”

By speeding the simulation process, the Battleship Water Department made it possible for the supervisors to make more creative decisions. “Before, we saw simulations separately in five viewings,” says Pablo Helman, VFX supervisor. “Now we see everything. When you can see all [the splashes, mist, and foam] in one [simulation], you can make specific comments. It’s always about feedback. And the faster we can do the simulations, the more accurate we can be to the vision we have.”

Aliens!

With cat-like eyes visible behind the visors of their helmets, hands like claws, and body armor, these huge, unwelcome visitors from another planet look human enough to be believable.

“Pete [Berg] is an athlete, and he loves sports,” says visual effects supervisor Grady Cofer, referring to Battleship’s director. “He wanted athletic, powerful creatures that weren’t so different from us. He wanted creatures we could relate to.”

On set, ILM used its iMocap system to capture the performance of stunt actors playing the aliens. Even though the digital thugs would be seven feet tall and weigh 450 pounds, the actors gave the animators a basis from which to work. “We took the essence of their performance and applied it to the characters,” says Glen McIntosh, animation director. “It’s as if they’re from a planet with a parallel evolution, slightly skewed. They have dirt under their fingernails.”

Frank Gravatt supervised modelers who worked from maquettes and concept art to create the aliens’ faces and hands.

“We built on technology and rigs we’ve had in the past, and we have a strong modeling and painting crew, but there were a lot of aesthetic challenges, and we had many revisions,” says Doug Smythe, digital production supervisor. “We probably went through a dozen iterations on the land commander’s armor alone, trying to figure out the color and patterns, whether the aliens could see through the visor, and what the visor was made of. Trying to nail down the look was far more difficult than the actual implementation.”

Russell Earl supervised the modelers who built the armor, which they placed on a complete body that was modeled beneath. “The armor was particularly challenging because it wasn’t the same on the left side and the right side,” Earl says. “So we had to build both sides individually, which added a lot of time to the schedule. We lost the advantage we sometimes count on.” Riggers and texture painters couldn’t count on being able to copy from one side to the other, either; the asymmetry affected artists throughout the pipeline.

Once animated, the aliens moved into the creature department, which hooked the armor pieces together, added steam to show the aliens’ breath, and fixed any shape interpenetrations and cracks in the surface. At first, the director wanted to give the aliens dry skin. “Dry skin looks artificial,” Smythe says. “But, he didn’t want them to look slimy.” New shaders provided the creative compromise.

–Barbara Robertson

Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at BarbaraRR@comcast.net.