Kong
No stranger to the MonsterVerse, Scanline contributed to both Godzilla and Skull Island, but none of its work involved the title creatures. This time, however, the studio was charged with the Kong asset, which it then shared with Weta and MPC. The story line of Godzilla vs. Kong takes place about 50 years after the events of Skull Island. So, to maintain a degree of continuity in the character, the artists started from the adolescent, much smaller Kong model from that earlier film, which was provided by the client, along with real-world references. They then cycled through several concepts before landing on the final “old man Kong asset” that’s in the latest movie.
“The idea was that Kong would be much larger physically and be showing signs of age to reflect the time that has passed,” explains Bryan Hirota, Scanline’s visual effects supervisor on the film. “We bulked up Kong’s model substantially, increasing his muscle mass, and added gray and white hairs throughout his fur, along with various battle scars.”
Early on, a team of artists studied gorillas at the Los Angeles Zoo and gathered photographic and video reference. They also examined images of older gorillas and primates to study how their features change as they age, “keeping in mind that Kong is an anthropomorphized ape and has a lot of human characteristics,” says Hirota. The team additionally studied the physicality of older bodybuilders and weightlifters.
Two setups were used to handle Kong’s muscle simulations. For shots that required fine muscle and tissue detail, the group modeled the skeleton, muscles, and tissue, with thickness for the fascia and skin, and then used Ziva Dynamics’ FEM solver, a physics-based muscle simulator, which gave a believable physicality to the creature. For efficiency on shots that didn’t require such hero simulations, the artists developed a body muscle system in Autodesk’s Maya with an anatomical procedural jiggle rig that ran in near real time for rapid iteration. The artists could mix the results from Ziva with the real-time Maya jiggle rig on a shot-by-shot basis.
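In rough form, that per-shot mix could be as simple as a blendshape between the two cached results. The following is a minimal sketch, assuming both solvers have already been baked out as geometry with matching topology; the mesh and node names here are illustrative, not Scanline’s actual pipeline.

```python
# Hedged sketch: blend a hero Ziva FEM result with a real-time jiggle-rig
# result on a per-shot basis. All names are hypothetical.
import maya.cmds as cmds

def mix_muscle_results(render_mesh, ziva_cache, jiggle_cache, ziva_weight=0.7):
    """Blend two simulated meshes onto the render mesh with a blendShape."""
    bs = cmds.blendShape(ziva_cache, jiggle_cache, render_mesh,
                         name="muscleMix_BS", frontOfChain=True)[0]
    # Index 0 = Ziva FEM result, index 1 = Maya jiggle-rig result.
    cmds.blendShape(bs, edit=True, weight=[(0, ziva_weight),
                                           (1, 1.0 - ziva_weight)])
    return bs

# Example: favor the hero FEM sim for a close-up shot.
# mix_muscle_results("kong_body_render", "kong_ziva_result", "kong_jiggle_result", 0.8)
```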
Scanline further developed an auto simulation process for the muscles, jiggle, and hair, which could run over a series of shots to increase efficiency even more.
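A multi-shot “auto sim” of this kind might, in its simplest form, look like the batch sketch below: open each shot file, step through the frame range so the simulations evaluate, and write out a cache. The shot list, paths, and the exported root node are all assumptions for illustration; the Alembic export stands in for whatever cache format the studio actually used.

```python
# Hedged batch-simulation sketch using mayapy; paths and node names are invented.
import maya.standalone
maya.standalone.initialize(name="python")

import maya.cmds as cmds
cmds.loadPlugin("AbcExport", quiet=True)

SHOTS = ["/shots/gvk_010.ma", "/shots/gvk_020.ma"]  # illustrative shot files

for shot_path in SHOTS:
    cmds.file(shot_path, open=True, force=True)
    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)

    # Step through the frame range so muscle, jiggle, and hair sims evaluate.
    for frame in range(int(start), int(end) + 1):
        cmds.currentTime(frame, edit=True)

    # Export the simulated body mesh to an Alembic cache for downstream use.
    job = "-frameRange {} {} -root |kong_body_render -file {}.abc".format(
        int(start), int(end), shot_path.rsplit(".", 1)[0])
    cmds.AbcExport(j=job)

maya.standalone.uninitialize()
```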
On the aesthetic front, getting the right look for an aged Kong required some back and forth between Scanline and the director and studio. “We needed to get the right amount of beard coverage and aging in the groom,” Hirota explains.
Kong has a number of different groom states throughout the film — dry, wet, oily, and burnt — which had to be tracked for continuity. Artists shaded his hair with Chaos’ VRayHairNextMtl; the groom itself was done in Maya’s XGen, and the simulation in Maya’s nHair. “We spent a good deal of effort on introducing the aging and placement of both gray and white hairs in his groom,” says Hirota. “Once we were happy with the base look of the fur, we developed the other variants that are seen in the film, including the various stages of wetness and a dusty and oily version from his final confrontation with Mechagodzilla.”
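Setting up those look variants boils down to building a hair shader per state and assigning it to the groomed geometry. The sketch below shows the general idea, assuming the V-Ray for Maya plug-in is loaded; the node, object, and variant names are illustrative only.

```python
# Hedged sketch: create dry/wet/oily shader variants for the fur and assign
# them to existing hair shapes. All names are hypothetical.
import maya.cmds as cmds

def make_hair_shader(variant, hair_shapes):
    """Build a V-Ray hair material and shading group for one groom state."""
    shader = cmds.shadingNode("VRayHairNextMtl", asShader=True,
                              name="kongFur_{}_MTL".format(variant))
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name="kongFur_{}_SG".format(variant))
    cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")
    cmds.sets(hair_shapes, edit=True, forceElement=sg)
    return shader

# Each variant ("dry", "wet", "oily", "burnt") would then diverge by adjusting
# shader attributes such as glossiness and dye color per look.
```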
Scanline, in fact, completely overhauled its hair system to allow for interactive manipulation of the guide hairs, and created a multi-shot hair simulation tool. All the fur elements were created by sculpting guide curves. “At the outset, we would start with a smaller amount of guide curves and try to push the groom to about 70 percent completion. We were able to change the parameters of all the grooming attributes on the fly without having to redo everything, as is the case with a purely sculpting-based grooming workflow,” Hirota says. “Although we were using a procedural approach, we were still able to utilize some of the purely sculpting-based grooming tool features to add finer details when required.”
As Hirota points out, Kong’s groom was so complex that it initially limited the team’s efficiency and ability to iterate. To solve that, they split the ape’s groom into 10 smaller sections, resulting in faster hair generation and preview times in the viewport. They also harnessed the viewport rendering features in Maya’s Viewport 2.0, which provide full shading for fur as well as lighting and shadow previews. “Being able to view something that closely resembled how the rendered fur would look without having to go through render tests meant we were able to iterate much faster,” he says.
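A simplified version of that workflow is sketched below: only the groom section being worked on is shown, and the active panel is switched to Viewport 2.0 for shaded, lit fur preview. The section and group names are hypothetical stand-ins for however the production actually divided the ten regions.

```python
# Hedged sketch: isolate one groom section and enable Viewport 2.0 preview.
import maya.cmds as cmds

GROOM_SECTIONS = ["face", "chest", "back", "armL", "armR",
                  "legL", "legR", "handL", "handR", "shoulders"]  # illustrative

def isolate_groom_section(section):
    """Show one groom section's group and hide the rest to speed up previews."""
    for name in GROOM_SECTIONS:
        grp = "kongGroom_{}_GRP".format(name)
        if cmds.objExists(grp):
            cmds.setAttr(grp + ".visibility", name == section)

def enable_vp2_preview(panel="modelPanel4"):
    """Switch a model panel to Viewport 2.0 with lights and shadows enabled."""
    cmds.modelEditor(panel, edit=True, rendererName="vp2Renderer",
                     displayLights="all", shadows=True)

# isolate_groom_section("face")
# enable_vp2_preview()
```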
In all, Kong has over six million hairs (6,358,381 to be exact) that were simulated in every shot he is in.
One of the more difficult aspects of Kong, according to Hirota, stemmed from his human-like qualities, particularly when expressing human-like emotions. “As we knew Kong’s performance was going to be crucial to the story and that he was going to have to convey a wide range of emotions, we dedicated a focused effort to rebuilding our eye model for Kong’s eyes,” he says.
The group worked on accurately replicating the shape of the cornea so it would refract light and interact properly with the iris, and added a thin membrane for the eye veins so they weren’t simply painted onto the sclera. They further added a tear film with full control of the meniscus, enabling them to control the mix of oil and water that sits on the surface of the eye. “All these additions meant we were able to get proper colorization and increased realism into our eye model,” Hirota says.
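Structurally, the idea is a layered eye: separate surfaces for the cornea, sclera/iris, vein membrane, and tear film, each carrying its own shader rather than having veins and wetness painted onto a single ball. The sketch below only illustrates that layering; the mesh and shader names are assumptions, and the actual material settings would be far more involved.

```python
# Hedged sketch: assign a dedicated shader to each layer of a multi-surface eye.
import maya.cmds as cmds

EYE_LAYERS = {
    "cornea":       "eyeCornea_MTL",      # refractive, bends light onto the iris
    "scleraIris":   "eyeScleraIris_MTL",  # diffuse/SSS base of the eye
    "veinMembrane": "eyeVeins_MTL",       # thin membrane carrying the veins
    "tearFilm":     "eyeTearFilm_MTL",    # oil/water film with controllable meniscus
}

for mesh_suffix, shader_name in EYE_LAYERS.items():
    mesh = "kongEyeL_{}_GEO".format(mesh_suffix)  # hypothetical geometry name
    shader = cmds.shadingNode("VRayMtl", asShader=True, name=shader_name)
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name=shader_name.replace("_MTL", "_SG"))
    cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")
    if cmds.objExists(mesh):
        cmds.sets(mesh, edit=True, forceElement=sg)
```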
Kong exhibits a wide range of emotional states throughout the film, from tender moments with the human Jia to epic moments of rage. “Especially for moments when Kong needs to express specific emotions, we would motion-capture a full performance for Kong (both body and face),” says Hirota. This was done with a new facial capture system built around software from Faceware Technologies, and throughout the process, the team used feedback from machine learning to improve and refine the retargeting of the human performance onto Kong’s face. The group additionally referenced a study on primate FACS from the University of Portsmouth to discern the differences between human and chimpanzee faces.
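This is not Faceware’s API, but the core retargeting idea can be illustrated as a remap from solved human facial channels onto Kong’s more ape-like face controls, with per-channel gains that get refined over time (the role the machine-learning feedback played in production). Every channel and attribute name below is invented for the example.

```python
# Hedged retargeting sketch: key Kong's face targets from solved human channels.
import maya.cmds as cmds

# Per-channel remapping: human capture channel -> (Kong face target, gain).
RETARGET_MAP = {
    "browRaiser":    ("kongFace_BS.browRaise",  0.6),
    "lipCornerPull": ("kongFace_BS.lipStretch", 0.4),
    "jawDrop":       ("kongFace_BS.jawOpen",    1.0),
}

def apply_solved_frame(solved_channels, frame):
    """Key Kong's face targets from one frame of a solved human performance."""
    for channel, value in solved_channels.items():
        if channel not in RETARGET_MAP:
            continue
        attr, gain = RETARGET_MAP[channel]
        cmds.setKeyframe(attr, time=frame, value=value * gain)

# apply_solved_frame({"jawDrop": 0.8, "browRaiser": 0.3}, frame=1024)
```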
Mechagodzilla
For this mechanical beast, Scanline received an approved design from the client but had to turn it into a fully articulated creature that could function in 3D in all of the ways required of him. According to Hirota, the team started by developing all the geometry and mechanisms for how his joints would function without interpenetration, which entailed creating gimbaling surfaces and sliding metal panels, as well as bespoke mechanisms. Given the creature’s scale, the artists further applied small-scale details to his surface and internal build to reflect his sheer size and construction history.
They also researched, designed, and incorporated the various weapons, attack systems, and defense systems that Mecha uses in the film, ensuring that those devices worked within the established model. The group then used its in-house Manifest layout system to connect all the Mecha parts into one master asset, much like piecing together a puzzle.
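Manifest is proprietary, but the “puzzle assembly” idea can be approximated with stock Maya file referencing: each part is built in its own file and referenced under a namespace into a single master scene. The paths, namespaces, and the assumed per-part root node below are all hypothetical.

```python
# Hedged stand-in for a layout/assembly step; not Scanline's Manifest system.
import maya.cmds as cmds

MECHA_PARTS = {
    "head":  "/assets/mecha/head_v012.ma",
    "torso": "/assets/mecha/torso_v009.ma",
    "armL":  "/assets/mecha/arm_L_v015.ma",
    "armR":  "/assets/mecha/arm_R_v015.ma",
    "tail":  "/assets/mecha/tail_v007.ma",
}  # illustrative part files

def assemble_master_asset(parts):
    """Reference each part under its own namespace and group the part roots."""
    for namespace, path in parts.items():
        cmds.file(path, reference=True, namespace=namespace)
    # Assumes each part exposes a top node named <namespace>:root_GRP.
    roots = ["{}:root_GRP".format(ns) for ns in parts
             if cmds.objExists("{}:root_GRP".format(ns))]
    return cmds.group(*roots, name="mechagodzilla_master_GRP") if roots else None
```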
As Hirota notes, given the mechanical nature of the creature, it did not need muscle, fat, skin, or hair-type secondary simulation. “However, we did have a method for some dampened vibration of the larger panels and parts of Mecha to give it a bit of complexity,” he says. “Given his larger size and his non-biological construction, we could immediately give him a strength, speed, and flexibility advantage, which opened exciting possibilities we didn’t have with the other titans.”