The Matrix Resolution

Volume 26, Issue 12 (December 2003)

By Barbara Robertson

Photos courtesy of Warner Bros.
©2003 Warner Bros. (US, Canada, Bahamas, Bermuda); Village Roadshow Films (BVI), Ltd.


The complex machine world, the spiny machine god, and the swarms of bugs emanating from the god's face were created with computer graphics at Tippett Studios. Only Neo (actor Keanu Reeves) and the set upon which he's standing were real.




The dark, complex science-fiction trilogy about revolutionaries who break free from a life ruled by machines, machines that draw energy from humans hooked to virtual-reality hoses, struck a worldwide chord: the first two films, The Matrix, released in 1999, and The Matrix Reloaded, released this past May, have earned $1.2 billion in box-office revenues. Even though the eagerly anticipated Reloaded did not receive critical acclaim, its visual effects were highly praised. The same has proved true for the third film, The Matrix Revolutions, which has won accolades for its visual effects as well as box-office success: thumbing their noses at the critics, Matrix fans supported Revolutions to the tune of $202.8 million worldwide through its first weekend, making it the biggest global film opening ever.

All three Warner Bros. films were written and directed by brothers Larry and Andy Wachowski, who brought the trilogy to its resolution in The Matrix Revolutions with a war between free humans and machines. The war takes place on two planes: while the free citizens of Zion battle machines in the real world, Neo (Keanu Reeves), the film's star, battles his nemesis, Agent Smith (Hugo Weaving), in the simulated world. Effects for Revolutions were created primarily by ESC and Tippett Studios, with Sony Pictures Imageworks, BUF, and Giant Killer Robots also contributing shots. Visual effects supervisor for all three films was John Gaeta, who won a visual effects Oscar for The Matrix in 2000 (see "New Realities," pg. 56).

"Visual effects has to invent and innovate at a blistering speed," Gaeta says, citing three areas of innovation in Revolutions: virtual humans, proceduralization of creature animation, and algorithmic environments. "I don't think there can be anything cooler in 2003 than gigantic robot wars done in the most extremely detailed photorealistic wildness," he says. "And, we long ago thought it would be fascinating to have machines grow their city algorithmically."

Although visual effects are used throughout Revolutions, three sequences near the end highlight these innovations: the siege, a major battle between humans and sentinels (squid-like machines), created largely at ESC, with Imageworks handling skirmishes in the tunnels leading to the subterranean battleground; the machine world, where Neo negotiates with the machine god, created at Tippett Studios; and a high-flying martial arts duel between Neo and Agent Smith in a mega city, created at ESC.

Near the end of the film, humans who have freed themselves from machine control face a terrifying attack when the digger, an enormous drill bit, cuts through the ceiling of their underground docking station. The digger was built as a miniature and replicated in CG; the "dock" was created as a set piece for live-action shots, as a miniature for motion-control shots, and as a complex digital model created at ESC. "We tried to mix it up," says Gaeta, "to have an organic aesthetic for this battle for Zion. Everything is digital, but we acquired elements from art-directed sets and used miniatures to get that gritty organic quality."

When the ceiling opens, thousands of sentinels swarm through. The sentinels, always CG, were created and performed for the siege sequence at ESC, although Tippett Studios and Sony Pictures Imageworks created sentinels for other shots.

The Zionists battle the sentinels from seats in maneuverable 15-foot-tall robots called APUs (armored personnel units), one APU per person, firing at the swarms by aiming the device's mechanical shooting arms, and moving into new fighting positions by driving the contraption's mechanical legs. To put people into the robots, ESC used a variety of techniques.

A system developed by an ESC effects team created tracers by firing a predetermined number of particles per frame based on the number of rounds per second fired by the APUs. Animators controlled the direction and set the timing for the gunfire.
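ESC's tracer code is not published, but the rate-based idea the team describes can be sketched in a few lines of Python (the class and parameter names here are invented for illustration). The key detail is carrying fractional rounds across frames so the tracer stream stays steady at any frame rate:

    # Hypothetical sketch of a rate-based tracer emitter: particles per
    # frame are derived from the gun's rounds per second, as described.

    class TracerEmitter:
        def __init__(self, rounds_per_second, fps=24.0):
            self.rounds_per_second = rounds_per_second
            self.fps = fps
            self.carry = 0.0  # fractional rounds carried between frames

        def emit(self, frame, direction, muzzle_pos):
            """Return the tracer particles to spawn on this frame."""
            exact = self.rounds_per_second / self.fps + self.carry
            count = int(exact)
            self.carry = exact - count
            # One particle per round fired this frame, launched along the
            # animator-controlled aim direction.
            return [{"frame": frame, "pos": muzzle_pos, "vel": direction}
                    for _ in range(count)]

    # Example: an APU gun firing 90 rounds per second at film rate.
    emitter = TracerEmitter(rounds_per_second=90)
    particles = emitter.emit(frame=1, direction=(0.0, 0.0, 1.0),
                             muzzle_pos=(0.0, 4.5, 0.0))
    print(len(particles))  # 3 this frame; the leftover 0.75 round carries over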




For the first scene in which the APU army is revealed, the effects team used a life-sized model of the anthropomorphic robot. By filming actors and stunt doubles in this APU and later replicating them, the crew created hundreds of warriors ready for battle.

For most shots in the battle, though, the APUs were computer-generated. "When you see an actor climbing into the carriage, it's a physical APU, but once the APUs start moving, they're a combination of CG and live action," explains George Murphy, visual effects supervisor at ESC, along with Joe Takai, for the siege sequence.

The trick was to match perfectly the actions of the live-action actors with the animation of their CG machines. Working from rough animatics created by Gaeta and the previz company Pixel Liberation Front (PLF), ESC animators created the robots' performances. Separately, a six-axis motion base was devised for the actors to ride. The crew then turned the virtual motion into real motion by driving the base with the animation data. "We took the animation all the way to the approval stage in terms of what would affect the motion carriage," says Murphy, "and then programmed the motion base with that data." Twelve actors were filmed riding the pre-programmed motion base, including Gaeta, who can be seen getting attacked by a sentinel during the final stages of the battle.
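A minimal sketch of that hand-off, assuming simple linear resampling (the production's actual motion-base format is not described): the approved carriage animation, keyed at film rate, is resampled at the rig's higher update rate.

    # Hypothetical sketch of turning approved CG animation into motion-base
    # commands: resample one axis of the carriage's animation from film
    # frames to the rig's update rate.

    def lerp(a, b, t):
        return a + (b - a) * t

    def resample_channel(keys, film_fps, base_hz):
        """keys: per-frame values for one axis, sampled at film_fps.
        Returns values resampled at base_hz by linear interpolation."""
        duration = (len(keys) - 1) / film_fps          # seconds of animation
        n_out = int(duration * base_hz) + 1
        out = []
        for i in range(n_out):
            t = i / base_hz * film_fps                 # position in film frames
            f = min(int(t), len(keys) - 2)
            out.append(lerp(keys[f], keys[f + 1], t - f))
        return out

    # Example: a 24 fps "heave" channel resampled for a 60 Hz motion base.
    heave_keys = [0.0, 0.1, 0.4, 0.9, 1.2]             # meters, one key per frame
    commands = resample_channel(heave_keys, film_fps=24.0, base_hz=60.0)
    print(len(commands), commands[:4])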

For APUs farther from camera, the crew put digital doubles into the driver's seats, applying motion-captured data to the CG characters with the help of Kaydara's Filmbox, rather than compositing live actors filmed on the motion base. "We had a setup in Sydney where we could motion-capture large groups of people doing various operations, such as firing guns," says Murphy. "We also used motion capture to populate the dock with humans. When you see people running into a tower, they're all motion-captured."

Actors rode on motion bases programmed with animation data from the digital APUs so that the two performances would match.




The CG APUs were built by a nine-person modeling team led by Brian Freisinger that created some 1000 models for Revolutions using NewTek's LightWave and Alias Systems' Maya. The APU alone had between 200 and 300 moving parts, including guns, hoses, cables, chains, and bullets. Animators keyframed the APU performances in Maya, and then body parts and accoutrements were handled by effects animators using rigid-body and soft-body dynamics. "John [Gaeta] and the brothers had a pretty wild sense of how stylized they wanted the performance," says Murphy.

The battle sequence begins with a shot of 100 digital APUs. "We started close to the back one, so it had to have very sharp detail," says Takai. "Then, we pulled back through the APUs." Because the shot also includes 300 digital people and the complex CG dock, it was one of the most difficult to render in the siege. "We did some resolution swapping and we rendered it in layers, but we still had to render 100 APUs with moving parts, and each one had something like 2000 mirrored surfaces and 12 gigs of texture maps. It was about a five-week render cycle."

When sentinels swarm in, the APUs raise their arms, start firing, and chaos breaks loose. Camera moves and actions for the hero characters were choreographed in previz by PLF, working with Gaeta and the brothers at EON, the film's development studio, while the concept art was created at ESC. "The siege was such a complex battle, EON did 3D animatics rather than storyboards, but they were simple renders. The color and lighting was not worked out at all," says George Hull, art director at ESC.

Hull spent a year and a half working at EON on storyboards and paintings for Revolutions' machine city, which was created largely at Tippett Studios. But when production began, he joined ESC where, working in Adobe Systems' Photoshop, he turned keyframes from PLF's 3D animatics of the siege into paintings. "Computer imagery is by nature noisy because you see every detail," he says. "The challenge was to reduce the visual clutter." Hull's images, which often obscured much of the background with smoke to highlight visual story points, became guidelines for compositors.

The swarms of sentinels were animated using a Maya-based behavioral system developed by Mike Morasky. "We had more than 100 shots with thousands of flying sentinels," he says. "By using the Maya particle system, we could create them, place them, give them a life, and then kill them using Maya's forces and emitters." Animation cycles created by keyframing and through "destruction procedures" were used to propel sentinels in the swarms, and nearly 30 parameters could alter their behavior over time and space.
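The article doesn't detail Morasky's system, but a bare-bones flavor of particle-driven swarming, written here as standalone Python rather than Maya particle expressions, might look like this: each sentinel-particle seeks a goal with a little per-agent jitter, and agents are created, given a life, and then killed when they arrive.

    # Minimal sketch (not Morasky's actual system) of swarming sentinels
    # as particles: a damped goal-seeking force plus per-agent jitter.
    import math, random

    class SwarmAgent:
        def __init__(self, pos):
            self.pos = list(pos)
            self.vel = [0.0, 0.0, 0.0]

        def step(self, goal, aggression, jitter, damping, dt):
            for i in range(3):
                seek = (goal[i] - self.pos[i]) * aggression
                noise = random.uniform(-jitter, jitter)
                self.vel[i] = self.vel[i] * damping + (seek + noise) * dt
                self.pos[i] += self.vel[i] * dt

    # Spawn a swarm, drive it toward the breach in the dock ceiling, then
    # "kill" agents that reach it, mirroring the create/place/live/kill stages.
    random.seed(1)
    swarm = [SwarmAgent((random.uniform(-50, 50), 100.0, random.uniform(-50, 50)))
             for _ in range(1000)]
    breach = (0.0, 0.0, 0.0)
    for frame in range(360):
        for agent in swarm:
            agent.step(breach, aggression=0.8, jitter=2.0, damping=0.9, dt=1.0 / 24)
        swarm = [a for a in swarm
                 if math.dist(a.pos, breach) > 5.0]    # kill on arrival
    print(len(swarm), "sentinels still inbound")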

When hit, sentinels shattered with the help of procedural rigid-body and particle simulations. "When we wanted to see thousands of bits of metal flying off, we did a particle simulation," says Takai. The sentinel's break-apart tentacles were built from a detailed cylinder instanced down a chain.
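The chain construction is described only in that one line, but it can be pictured with a short sketch (the layout rule and names are assumed): one detailed cylinder segment is instanced at positions along a curling chain, and links past a hit point are released to the rigid-body simulation.

    # Assumed sketch of a break-apart tentacle: one cylinder instanced
    # down a chain, with links detaching into dynamics when hit.
    import math

    def chain_points(n_links, length, curl):
        """Lay out link positions along a gently curling chain."""
        pts = []
        for i in range(n_links):
            t = i / max(n_links - 1, 1) * length
            pts.append((math.sin(t * curl) * 2.0, -t, 0.0))
        return pts

    def build_tentacle(n_links=40, length=12.0, curl=0.4):
        pts = chain_points(n_links, length, curl)
        return [{"instance": "cylinder_segment", "pos": p, "attached": True}
                for p in pts]

    def shatter(tentacle, hit_index):
        """Detach links past the hit point, handing them to dynamics."""
        for link in tentacle[hit_index:]:
            link["attached"] = False      # now driven by rigid bodies
        return tentacle

    tentacle = shatter(build_tentacle(), hit_index=25)
    print(sum(1 for l in tentacle if not l["attached"]), "links break away")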

To manage ballistics for the hellish war, a digital destruction team led by Adam Martinez developed a gunfire system. Working with bounding box proxies for APUs, sentinels, and other elements, the team added gunfire, tracers, smoke trails, and debris from the hits to match thousands of rounds fired by animated guns on the APUs. "A system written by one of our artists determined whether a bullet hit concrete, metal, or a flying sentinel, and put the proper effect in that area," says Martinez. Particles textured with density equations created smoke; particles rendered with a white-hot shader created sparks.
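A toy version of that classification step, with invented materials and effect names: each round is treated as a ray tested against the tagged bounding-box proxies, and the nearest hit determines which effect to spawn.

    # Hypothetical sketch of hit classification against bounding-box
    # proxies: each round is a ray; the tag on the nearest box picks
    # the impact effect.

    EFFECTS = {"concrete": "dust_puff", "metal": "spark_burst",
               "sentinel": "shrapnel"}

    def ray_hits_box(origin, direction, box, max_t=1e6):
        """Slab test: distance t where the ray enters the box, or None."""
        t_near, t_far = 0.0, max_t
        for i in range(3):
            lo, hi = box["min"][i], box["max"][i]
            if abs(direction[i]) < 1e-9:
                if not lo <= origin[i] <= hi:
                    return None
            else:
                t1 = (lo - origin[i]) / direction[i]
                t2 = (hi - origin[i]) / direction[i]
                t_near = max(t_near, min(t1, t2))
                t_far = min(t_far, max(t1, t2))
                if t_near > t_far:
                    return None
        return t_near

    def classify_round(origin, direction, proxies):
        hits = [(t, p["material"]) for p in proxies
                if (t := ray_hits_box(origin, direction, p)) is not None]
        if not hits:
            return None
        t, material = min(hits)            # nearest surface wins
        return EFFECTS[material]

    proxies = [
        {"material": "concrete", "min": (-10, 0, 20), "max": (10, 8, 22)},
        {"material": "sentinel", "min": (-1, 3, 10), "max": (1, 5, 12)},
    ]
    print(classify_round((0, 4, 0), (0, 0, 1), proxies))  # -> "shrapnel"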

To pull these elements together, six sequence leads managed different parts of the siege. Andrew Harris describes the process for one series of shots, Mifune's last stand: "Asset teams would bring in the camera and import all the elements on a checklist to assemble the shot. Then, a TD would place lights and run test renders, basically one TD per shot. We broke everything down and rendered in layers because there was no time to re-render an entire shot."

While the residents of Zion battle sentinel swarms, Neo and Trinity (Carrie-Anne Moss) pilot a ship through underground tunnels on their way to the machine world, flying past fetus fields (last seen in the first film). Once above ground, they face fire from an armada of sentinels and "tow bombs," escape by flying up through dark chemical clouds to see blue sky, and then dive down to crash inside machine city, where Neo meets the machine god, the deus ex machina. The entire complex, intricate, immense landscape for machine world and machine city was created by Tippett Studios, as were the fight with the sentinels along the way, the fetus fields, the chemical clouds, the blue sky, the denizens of machine city, the spaceship, and the machine god.

Helping coalesce the look of this world with thousands of elements was Grant Niesner who, like George Hull, was involved in pre-production design at EON where, among other tasks, he turned Geoff Darrow's APU drawings into a working 3D design.

"If there was any one square inch of a monitor that didn't have at least two dozen different types of elements in it, then the shot was probably incomplete," says Tippett Studios' Craig Hayes, visual effects supervisor of machine world. Of the 143 shots in the sequence, only 52 had background plates; and of all, machine city itself was the most demanding.

Animators brought into play anime, gangsta rap, and industrial robots when creating the APU's stylized performance.




To ensure continuity, the crew built a low-res "master set" for the 100-mile-wide city that could be viewed from multiple cameras. Changes made by set dressers for specific shots were updated on the master. "Once we got approvals, we butchered anything in the city that wasn't appropriate for the shot," says Hayes.

To build the city, effects lead Dan Rolinek developed systems that assigned building parts, modeled in multiple resolutions, to particles in Maya. "A huge area of the city needed to be encrusted with buildings and towers," he says. "So we didn't build real geometry. Instead, we emitted particles that represented building parts." One system kept track of the parts; another figured out where the bases were. Segments were stacked to form towers, and branches added on the sides bent toward the light. Although the crew could see low-res models in Maya, the complex geometry was brought in only at render time, using the delayed read archive feature of Pixar's RenderMan.
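A much-simplified illustration of the emit-parts-as-particles idea (the part library and stacking rules here are invented): each particle carries only a part ID and a position, while the heavy geometry those IDs refer to would be pulled in at render time.

    # Simplified sketch of growing machine-city towers from particles:
    # each particle stands in for a building part; real geometry would be
    # deferred to render time (delayed read archives in the production).
    import random

    PART_LIBRARY = ["base_a", "base_b", "segment_a", "segment_b", "spire"]

    def grow_tower(x, z, max_height, rng):
        """Stack part-particles into one tower at ground position (x, z)."""
        tower = [{"part": rng.choice(PART_LIBRARY[:2]), "pos": (x, 0.0, z)}]
        for level in range(1, rng.randint(2, max_height)):
            tower.append({"part": rng.choice(PART_LIBRARY[2:4]),
                          "pos": (x, level * 8.0, z)})
        tower.append({"part": "spire", "pos": (x, len(tower) * 8.0, z)})
        return tower

    rng = random.Random(42)
    city = []
    for _ in range(500):                   # encrust a patch of ground
        x, z = rng.uniform(-1000, 1000), rng.uniform(-1000, 1000)
        city.extend(grow_tower(x, z, max_height=12, rng=rng))
    print(len(city), "part-particles; geometry loads only at render time")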

Rolinek also led teams that created and fired the tow bombs (towed by sentinels), using particle-sphere explosions. Johnny Gibson, the CG supervisor who provided procedural shaders for chemical clouds, fog, and numerous other atmospheric effects in the sequence, created the explosion illusion by applying a procedural shader to the surfaces of the particle spheres. For the sentinels' attack on Neo and Trinity's ship, Rolinek once again used particle systems to control the swarms. "We'd store an animated sequence and instance that on particles at render time," he says.
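Render-time instancing of a stored cycle can be sketched in a few lines (the names are assumptions; the production pipeline is not published): each particle carries a frame offset into the baked swim cycle, so the swarm never moves in lockstep.

    # Sketch of instancing a stored animation cycle on particles at
    # render time: each particle maps to (position, cycle_frame).
    import random

    CYCLE_LENGTH = 48                      # frames in the baked swim cycle

    def resolve_instances(particles, frame):
        """Map each particle to the cycle frame the renderer should load."""
        instances = []
        for p in particles:
            cycle_frame = (frame + p["offset"]) % CYCLE_LENGTH
            instances.append({"pos": p["pos"], "cycle_frame": cycle_frame})
        return instances

    rng = random.Random(7)
    particles = [{"pos": (rng.uniform(-100, 100), rng.uniform(0, 50), 0.0),
                  "offset": rng.randrange(CYCLE_LENGTH)}
                 for _ in range(2000)]
    frame_12 = resolve_instances(particles, frame=12)
    print(frame_12[0])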

Hero sentinels were handled by animators, though. "They were amazing creatures to animate," says Simon Allen, animation supervisor. "They had 15 tentacles, 20-odd eyes, mandibles, and a laser would shoot between their legs. We had to keep the rig lightweight enough to move in the scene but still have enough controls to have them swim, grapple, and tear apart."

For the machine god, modelers and animators worked from reference footage of the Wachowskis' baby nephew. The swarms of machine bugs that form the god's face punctuate its emotional outbursts, sometimes flying toward Neo. "We had 30,000 characters, each controlled by particles with simple behaviors," says Demetrius Leal, effects lead. "We used a lot of algorithms so it wouldn't look like they were all following a path."
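One simple algorithm of that kind, offered as an assumption rather than Leal's actual code: give each bug a shared guide path plus a unique phase and radius, so no two trajectories coincide.

    # Assumed sketch of de-synchronizing 30,000 bugs: a shared guide path
    # toward Neo, offset per bug by a unique corkscrew wobble.
    import math, random

    def bug_position(t, phase, radius, guide):
        """Offset a shared guide path by a per-bug wobble."""
        gx, gy, gz = guide(t)
        return (gx + radius * math.cos(t * 6.0 + phase),
                gy + radius * math.sin(t * 6.0 + phase),
                gz)

    def guide(t):                          # from the god's face toward Neo
        return (0.0, 2.0, 10.0 * t)

    rng = random.Random(3)
    bugs = [(rng.uniform(0, 2 * math.pi), rng.uniform(0.1, 1.5))
            for _ in range(30000)]
    frame_positions = [bug_position(0.5, phase, radius, guide)
                       for phase, radius in bugs]
    print(frame_positions[0])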

Throughout the sequence, effects animators at Tippett worked hand in hand with the character animators, and the animators in turn worked with compositors, helping to determine, for example, the timing of the lightning bolts that were also used to accent the deus ex machina's emotions. "Their job is to know temporal," says Hayes. "We really wanted the lightning bolts to enhance the drama."

When Neo convinces the machine god that he can terminate Agent Smith, the battle below ends and the showdown between Neo and Agent Smith begins in The Matrix, on a street in a mega city lined on both sides with multiple Agent Smiths. A few Agent Smiths were manikins, some were live-action shots of Weaving that were digitally duplicated, but most were digital doubles created using the image-based facial animation system dubbed "universal capture," developed by George Borshukov, Dan Piponi, Oystein Larsen, J.P. Lewis, and Christina Tempelaar-Lietz for The Matrix Reloaded.

The showstopper in the fight is a punch Neo lands on Smith's face. It happens in slow motion in extreme close-up. As Neo's fist connects with Smith's flesh, Smith's face becomes distorted and you can see the indentation of Neo's fingers in his cheek. "Finally, we could show in full screen all the techniques we've developed in the last three and a half years for facial capture," says Borshukov, technology supervisor.

The universal capture system uses an array of synchronized cameras to capture an actor's performance in ambient lighting. Each pixel's motion is tracked over time in each camera view using optical flow and then combined with a cyberscan model of a neutral expression of the actor and photogrammetric reconstruction of the camera positions. Animated color maps for the model are created by combining images taken from the multiple camera views over time, and surface textures such as pores and wrinkles are added using a 100-micron scan of the actor's face taken with an Arius3D scanner from which bump and displacement maps are extracted.
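The per-pixel tracking step can be approximated with off-the-shelf dense optical flow; the snippet below, using OpenCV (an assumption, not necessarily what the production used), shows the idea on two synthetic frames from one camera view.

    # Rough sketch of the per-pixel tracking step of universal capture:
    # dense optical flow between two frames of one camera view.
    import numpy as np
    import cv2

    # Synthetic stand-ins for two consecutive frames: a bright blob that
    # shifts 3 pixels right, as a patch of cheek might move.
    prev_frame = np.zeros((128, 128), np.uint8)
    next_frame = np.zeros((128, 128), np.uint8)
    cv2.circle(prev_frame, (60, 64), 10, 255, -1)
    cv2.circle(next_frame, (63, 64), 10, 255, -1)

    # Dense (per-pixel) flow: each pixel gets a 2D motion vector over time.
    flow = cv2.calcOpticalFlowFarneback(
        prev_frame, next_frame, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # In production, vectors like these, combined across camera views with
    # the cyberscanned neutral mesh and reconstructed cameras, drive the face.
    print("flow near the blob:", flow[64, 60])   # roughly 3 px in x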

Artists at Tippett created this scene of harvester machines tending human fetuses using harvester parts borrowed from the first Matrix film. They also created thousands of other elements and added swamp gas to the fetus fields.




Because scans taken of Weaving were already high-res, the team picked one with an expression that could be manipulated to match a maquette showing Weaving's face deformed with the imprint from Neo's fist. However, because Smith's face would be so close to the camera, sculptor Rene Garcia modeled a new head at twice the resolution of the previous Agent Smith head. Also, the crew needed new universal capture data. Even so, says Borshukov, "We captured at 60 frames per second, but because the shot was so slow (the equivalent of 300 frames per second), we didn't have enough frames. So, we picked a section of performance and did some interpolation on the maps to extend the play time."
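A crude version of that map interpolation, assuming a simple linear blend (the article doesn't specify the method): the 60-frame-per-second capture is stretched by a factor of five toward the 300-frame-per-second equivalent.

    # Sketch of stretching 60 fps capture toward a 300 fps-equivalent shot
    # by interpolating between captured texture maps.
    import numpy as np

    def retime_maps(maps, stretch):
        """maps: list of (H, W, 3) float arrays at the source rate.
        Returns (len(maps) - 1) * stretch + 1 frames via linear blending."""
        out = []
        for i in range(len(maps) - 1):
            for s in range(stretch):
                t = s / stretch
                out.append((1.0 - t) * maps[i] + t * maps[i + 1])
        out.append(maps[-1])
        return out

    # 60 fps capture played at 300 fps equivalent -> 5x stretch.
    captured = [np.full((4, 4, 3), float(i)) for i in range(3)]
    slowed = retime_maps(captured, stretch=5)
    print(len(slowed), slowed[1][0, 0])    # 11 frames; blended values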

Animation supervisor Kody Sabourin extended Smith's performance into the exaggerated expression. "Once we had the performance plugged in, we used a series of wire deformers and clusters to model key frames and animate them over time," he says. As he manipulated the model, the UV map moved correspondingly. "It was really cool. I was working with a human face. I could see the textures and colors change."

In addition to Maya deformers, Sabourin used proprietary tools to ripple Smith's skin and make it fold back on itself. To help make the flesh look alive, the team used a proprietary method for doing subsurface scattering. "Every single pixel in that shot was computer-generated," Borshukov adds. "The rendering ran almost a week because we had water drops in the air and on the surface of the skin, with reflections and refractions, and we had to have full-on three-dimensional depth of field, so it was incredibly expensive."

Because the "super fight" takes place in a massive rainstorm, ESC had to create synthetic rain that matched real rain. Some shots cut from real rain to CG rain, and in other shots CG rain poured down in front of and behind real rain.

Tippett Studios' animators placed cues for lightning bolts that TDs created later so the bolts would accent the machine god's dialog.




"We started by building a raindrop," says Andy Lomas, CG supervisor, "and simulated light moving through it." The team then created a system that built ribbons of geometry from procedural shaders linked to material shaders; the ribbons were driven by Maya particles. To make the ribbons wobble and look natural, system parameters were used to change the amount of sparkle and the width of individual raindrops in the rain streaks. Splashes were created with separate Maya particle systems, and texture maps controlled the amount of spray flying off cloth edges. "We could detect how fast the rain was moving and cause it to emit spray when it hit something," Lomas says.

When the fight takes to the air, Neo and Smith are sometimes actors flying in rigs and sometimes digital doubles soaring above virtual buildings that extend the two buildings on the set. Most often, though, the actors were filmed flying in elaborate rigs on bluescreen stages. To exaggerate the live-action motion, ESC developed a "rear projection" technique that separated images of the live actors and allowed them to fly through the mega city farther than was physically possible. The virtual buildings were created with an image-based modeling system, under development at ESC, whose roots trace back to The Matrix. The images used for the buildings were photographs taken in Sydney at night and in daylight so that lightning strikes could accurately illuminate the textures. In the far background, a cyclorama built from 2D tiles created from the digital buildings served as a matte painting.

"We had an average of 106 elements in the street shots—the rain, fog, streetlights, buildings, up lights, water passes, water on the ground, splashes, the Smiths, and so forth," says compositing supervisor Matt Dessero. "And then there were night and day passes for the lightning."

Lighting was accomplished with the help of a system developed by Haarm-Pieter Duiker that utilizes on-set photography. A multipass system for mental ray, developed by Duiker and Thomas Driemeyer of mental images, helped speed rendering.
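Duiker's system is proprietary, but the general idea of lighting from photography can be sketched as follows: treat a latitude-longitude image of the set as a sphere of incoming light and integrate its cosine-weighted contribution for a given surface normal.

    # Sketch of the idea behind lighting from on-set photography: a
    # lat-long radiance image treated as a sphere of lights.
    import math
    import numpy as np

    def diffuse_from_env(env, normal):
        """env: (H, W, 3) radiance image in lat-long layout.
        Integrate cosine-weighted incoming light over the sphere."""
        h, w, _ = env.shape
        total = np.zeros(3)
        weight = 0.0
        for row in range(h):
            theta = math.pi * (row + 0.5) / h      # polar angle
            solid = math.sin(theta) / (h * w)      # per-pixel solid angle
            for col in range(w):
                phi = 2 * math.pi * (col + 0.5) / w
                d = (math.sin(theta) * math.cos(phi),
                     math.cos(theta),
                     math.sin(theta) * math.sin(phi))
                cos_term = max(0.0, sum(a * b for a, b in zip(d, normal)))
                total += env[row, col] * cos_term * solid
                weight += cos_term * solid
        return total / max(weight, 1e-9)

    env = np.ones((16, 32, 3)) * [0.9, 0.8, 1.0]   # flat bluish "set"
    print(diffuse_from_env(env, normal=(0.0, 1.0, 0.0)))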






Effects teams at ESC created these shots using CG rain mixed with real rain and by adding multiple digital Agent Smiths.

"Every technique for this show ends up on steroids," says Murphy. "A fair amount of technology came with the people who joined the studio, but we didn't use any one technique. We probably created on the order of 200 plug-ins for Maya, Apple Computer's Shake, and mental ray. This film had the most complex digital environments I've ever had to deal with."





Adds Takai, "Plus, we were building the facility, building the pipeline, and hiring people while we were doing the production. It was like building an F18 in-flight and everyone is in the plane while you're flying."

The Matrix Revolutions, indeed the entire trilogy, has no end of ironies, not the least of which, in terms of effects, was that the teams created a virtual world and virtual humans that look photorealistic by starting with images from the real world. And then they created real worlds that look like nothing you've ever seen before by starting with computer graphics. However, the greatest irony of this film, which tells the story of people breaking free from machine control, might be that the incredibly complex effects could not have been created without thousands of machines. Indeed, most of the effects were produced "in" the machines, and often procedurally. Whether the people creating these effects became slaves to their machines during the course of production is only for them to say. But given a choice, they'd probably do it again.

Barbara Robertson is a contributing editor of Computer Graphics World and a freelance journalist specializing in computer graphics, visual effects, and animation. She can be reached at BarbaraRR@comcast.net.

Next month, in Part 2 of this special report on the making of The Matrix Revolutions, we look at how previsualization was used to choreograph some of the film's most complex scenes.

Adobe Systems www.adobe.com
Alias Systems www.alias.com
Apple Computer www.apple.com
Kaydara www.kaydara.com
mental images www.mentalimages.com
NewTek www.newtek.com
Pixar Animation Studios www.pixar.com