Studio Up for the Challenge
March 8, 2016


In December 2015, SyFy Network aired the first ever adaptation of Arthur C. Clarke's Childhood's End, one of the most acclaimed sci-fi novels of all time. The three-episode miniseries follows the peaceful alien invasion of Earth by mysterious Overlords, whose arrival begins decades of apparent utopia under indirect alien rule, at the cost of human identity and culture.

Scandinavian VFX and animation studio Important Looking Pirates (ILP), known for its work on Kon-Tiki and currently working on Black Sails, was brought onto the project by Childhood's End Associate Producer and VFX Supervisor Kevin Blank, who had collaborated with ILP on Crossbones and Constantine.

As with both of those shows, ILP’s work on Childhood’s End was also recognized with a VES Award nomination. We caught up with ILP Co-Founder and VFX Supervisor Niklas Jacobson to get insight into ILP’s approach to rendering the alien spaceship and volumetrics.


What was the rendering challenge for you on this series?

Jacobson: We knew we were facing a challenging sequence in the first episode, when the alien Overlords first approached Earth. We made a dozen shots in which the Overlord mothership emerges from the clouds, and we needed to render big-scale simulations with the ship embedded in them. The third and final episode, where we visit the Overlords' home planet, came with an even greater challenge. It features a big-scale, full-CG environment with plenty of atmospheric effects like smoke, fog and lava rivers, as well as crowds of flying spaceships and Overlords, to make the world feel inhabited and populated.

We went for a full-CG approach on these environments, rather than a 2.5D or matte-painting approach, because we had quite a long pre-production period but a very short production period. The final shot count and layout were not locked and assigned until very late in the process, so we spent a decent amount of time during pre-production building a good set of assets that would let us swiftly set up and construct scenes once shots were awarded.

What rendering software did you use?

Jacobson: Arnold was our rendering engine of choice, and we used the MtoA plug-in. We are a multi-renderer facility, and we believe it is important not to be religious about software in general. If you are unbiased and keep an open mind, it is easier to choose the best software for the job you are facing. We knew that this show had high demands on poly count, displacements and epic volumetric simulations, things we know Arnold is great at, so Arnold felt like a natural choice.


What other creative and pipeline tools do you use?

Jacobson: We use Shotgun as a base for project management and have customized it extensively to incorporate it into our workflow and pipeline. Maya is our main 3D software. We use Nuke for compositing, ZBrush and Mudbox for sculpting, and Mari for texturing and projection painting. For FX work, we mostly use Houdini and a little bit of Maya. We keep a close eye on the market and regularly evaluate software that we find intriguing, to stay on top of new developments.

From the beginning, we have also embraced R&D and pipeline work as a crucial part of our philosophy of improving the way we do things. We value the process of how something is made as much as the final result. We try to challenge conventions and are not afraid of putting resources and time into proprietary development if we cannot find the right commercial tools for a problem we run into. This has led us to create a fair amount of proprietary software and tools: our own framebuffer, Cyclone; a render manager, Monster; and a volumetric renderer, Tempest, just to name a few.

How did you render volumes, and where were they modeled/designed/look-dev'd?

Jacobson: We generated OpenVDB volumes from all simulations, either from Houdini or directly from Maya. Those volumes were imported into Maya through Arnold's standard volume workflow, using the Arnold volume node. We set up the standard shading network using Arnold's volume collector to define and manipulate density, temperature and heat at render time. Once the shader was set up so that all fields were used and rendered correctly, we exported the Arnold volume to an ASS file.
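For illustration, here is a minimal sketch of what that setup might look like through MtoA's Python layer. The node, attribute and flag names (aiVolume, aiVolumeCollector, arnoldExportAss) are assumptions based on the MtoA of that period and vary between versions; this is not ILP's actual pipeline code.

```python
# Illustrative only: wire an OpenVDB cache into an Arnold volume, hook up a
# volume collector shader, and export the result to an .ass file.
# Node/attribute/flag names are assumptions based on MtoA of that period.
import maya.cmds as cmds  # requires the mtoa plug-in to be loaded

def build_and_export_volume(vdb_path, ass_path, name="overlordClouds"):
    # Arnold volume container pointing at the OpenVDB cache and its grids
    vol = cmds.createNode("aiVolume", name=name + "Shape")
    cmds.setAttr(vol + ".filename", vdb_path, type="string")              # assumed attr
    cmds.setAttr(vol + ".grids", "density temperature heat", type="string")

    # Volume collector shader: density/temperature/heat are read from the
    # grids and manipulated at render time
    shader = cmds.shadingNode("aiVolumeCollector", asShader=True, name=name + "_VC")
    cmds.setAttr(shader + ".scatteringChannel", "density", type="string")  # assumed attr
    cmds.setAttr(shader + ".emissionChannel", "heat", type="string")       # assumed attr

    # Assign the shader through a shading group's volume slot
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name=name + "SG")
    cmds.connectAttr(shader + ".outColor", sg + ".volumeShader", force=True)
    cmds.sets(vol, edit=True, forceElement=sg)

    # Export just this volume (and its shading) to an .ass container
    cmds.select(vol, replace=True)
    cmds.arnoldExportAss(f=ass_path, selected=True)  # flag names vary by MtoA version
    return vol, shader
```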

This ASS file was imported into a 'rigging' scene containing a controller curve, which handled transformations and time offsets, as well as a rough geometry representation of the volume to help the lighting TDs with layout and timing of each shot. The lighting TDs would then import this rig as a reference, which allowed their scenes to open without much overhead from the 100-plus volumes. This was helpful during the layout process.
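A rough sketch of what such a rig could look like is below. It is not ILP's actual setup (they used their own stand-in architecture, described next); it uses MtoA's stock aiStandIn node and assumed attribute names purely to illustrate a controller curve driving a referenced volume plus proxy geometry.

```python
# Illustrative rig only: a controller curve drives the transform (and carries
# a time-offset attribute for tooling to read), with a stand-in for the
# exported .ass and a proxy cube for viewport layout. The aiStandIn attribute
# name (.dso) is an assumption and differs across MtoA versions.
import maya.cmds as cmds

def build_volume_rig(ass_path, name="overlordClouds"):
    # Controller curve: the single animatable handle for transform and timing
    ctrl = cmds.circle(name=name + "_CTRL", normal=(0, 1, 0), radius=10)[0]
    cmds.addAttr(ctrl, longName="timeOffset", attributeType="float",
                 defaultValue=0.0, keyable=True)  # hypothetical hook for time offsets

    # Stand-in pointing at the exported Arnold scene for the volume
    standin = cmds.createNode("aiStandIn", name=name + "_ASSShape")
    cmds.setAttr(standin + ".dso", ass_path, type="string")  # assumed attr name
    standin_xform = cmds.listRelatives(standin, parent=True)[0]

    # Rough proxy geometry so layout and timing can be judged in the viewport
    proxy = cmds.polyCube(name=name + "_proxy", width=50, height=30, depth=50)[0]

    # Parent everything under the controller so one transform drives it all;
    # lighting TDs would then reference this file into their shot scenes
    cmds.parent(standin_xform, proxy, ctrl)
    return ctrl
```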

We used our own in-house ASS stand-in architecture to reference the ASS volume containers. This had the big advantage of allowing quick and easy overrides on all ASS files.
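ILP's stand-in system is proprietary, but as an illustration of the general idea, a per-shot override pass over an exported .ass file could be expressed with the standard Arnold Python bindings along these lines (the node names and the float-only overrides are assumptions for the sketch):

```python
# Not ILP's in-house system, just an illustration: load an exported .ass,
# apply per-shot parameter overrides by node name, and write it back out,
# using the standard Arnold Python bindings (the 'arnold' module from the SDK).
from arnold import (AiBegin, AiEnd, AiASSLoad, AiASSWrite,
                    AiUniverseGetNodeIterator, AiNodeIteratorGetNext,
                    AiNodeIteratorFinished, AiNodeIteratorDestroy,
                    AiNodeGetName, AiNodeSetFlt, AI_NODE_ALL)

def apply_overrides(src_ass, dst_ass, overrides):
    """overrides: {node_name: {float_param_name: value}} (floats only in this sketch)."""
    AiBegin()
    AiASSLoad(src_ass, AI_NODE_ALL)

    it = AiUniverseGetNodeIterator(AI_NODE_ALL)
    while not AiNodeIteratorFinished(it):
        node = AiNodeIteratorGetNext(it)
        params = overrides.get(AiNodeGetName(node), {})
        for param, value in params.items():
            AiNodeSetFlt(node, param, value)
    AiNodeIteratorDestroy(it)

    AiASSWrite(dst_ass, AI_NODE_ALL, False)
    AiEnd()

# e.g. thin out one cloud's shader for a particular shot (names are hypothetical):
# apply_overrides("clouds_v012.ass", "clouds_v012_sh040.ass",
#                 {"overlordClouds_VC": {"density": 0.6}})
```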

The lighting TDs had control over the shader of any volume and could simply copy/paste those changes as Python scripts to as many volumes as they wanted. We also created presets, which enabled the lighting TDs to tweak the density of a volume to a specific percentage of what the FX team originally designed it for. The density presets made sure that all other channels followed along without breaking the look of the volume, for example by blowing out the temperature fields.
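As a hedged sketch of what such a preset might look like, the snippet below scales an assumed pair of scalar attributes on a list of volume collector shaders to a percentage of the FX department's original values, keeping everything relative to a stored copy of the FX look. The attribute names are placeholders, not ILP's actual preset tool.

```python
# Hedged sketch of a density preset: scale placeholder scalar attributes on a
# set of volume collector shaders to a percentage of the FX-authored values.
import maya.cmds as cmds

# Placeholder attribute names; the real ones depend on the shader in use.
PRESET_ATTRS = ("scatteringIntensity", "emissionIntensity")

def apply_density_preset(collector_shaders, percent):
    factor = percent / 100.0
    for shader in collector_shaders:
        for attr in PRESET_ATTRS:
            base_attr = shader + ".fxBase_" + attr
            # Stash the FX-authored value the first time a preset is applied,
            # so every preset stays a percentage of the original FX look
            if not cmds.objExists(base_attr):
                cmds.addAttr(shader, longName="fxBase_" + attr, attributeType="float")
                cmds.setAttr(base_attr, cmds.getAttr(shader + "." + attr))
            cmds.setAttr(shader + "." + attr, cmds.getAttr(base_attr) * factor)

# e.g. thin all selected volume shaders to 60% of the FX look:
# apply_density_preset(cmds.ls(selection=True, type="aiVolumeCollector"), 60)
```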

This process allowed the FX team to concentrate on generating simulation elements, rather than on the placement and shading of big environmental simulations. The lighting TDs got a fast and snappy scene and full control over the artistic look, without having to worry too much about the technical aspects of rendering a shot containing 100-plus volumes of varying resolution, size and density.


What was your rendering hardware configuration and average render time?

Jacobson: Most of our rendering is CPU-bound, and we use Supermicro twin 2U servers, four separate machines in a 2U form factor, which gives us excellent compute density at a very attractive price. We currently have 67 workstations and 82 render nodes. Most of the workstations are four-core Core i7 and Xeon E5-1620 machines with 24-64GB of RAM. The render nodes are a mix of 12-, 16- and 20-core machines with 48-128GB of RAM.

This setup covers our base needs, but during peaks we expand capacity with cloud nodes. Childhood's End delivered at the same time as our work on Black Sails season three, so during this period we added 100 of Amazon's c3.8xlarge instances, i.e. Xeon E5-2680 v2 with 60GB of memory and 2 x 320GB of SSD storage.

Our average frame times for final renders were about two to three hours per frame. Some extensive environment shots with heavy volumetrics ran longer than that, due to multiple render passes to construct the final image. Arnold is very consistent when it comes to render time. We usually end up somewhere around those numbers on complex scenes.

Childhood’s End is available for streaming and/or purchase from SyFyNow, Hulu, Amazon Prime, Google Play, iTunes, Xbox and YouTube. Important Looking Pirates is one of Scandinavia’s largest VFX and digital animation studios. Founded in 2007 by Niklas Jacobson and Yafei Wu and based in Stockholm, ILP creates digital content for all forms of production including TV series, feature film, commercials, VR, web and games. 

IMAGE CREDITS: Courtesy of SyFy and Important Looking Pirates