Photorealism in Real Time
Issue: Volume 40, Issue 2 (Mar/Apr 2017)

What happens when two facilities that are pushing technological boundaries join forces and take their cutting-edge ideas and innovations to a new level? Something truly amazing and groundbreaking.

Recently, Epic Games and The Mill in New York teamed up to revolutionize the conventions of digital filmmaking in an endeavor code-named Project Raven. Epic provided an advanced version of its Unreal Engine along with features still under development for real-time production. The Mill, an expert in VFX and production, used its just-released virtual production tool kit, Mill Cyclops, along with Mill Blackbird, a technology-laden, fully adjustable vehicle rig that captures high-quality environment data. Car manufacturer Chevrolet gave the collaboration additional traction, providing design information for two of its sports cars (a 2017 Camaro ZL1 and an FNR concept vehicle) as well as further functionality that resulted in a new level of customer engagement.

The first innovation came when lighting and other ever-changing environmental information was captured and processed instantaneously on set, then applied to two CG models that seamlessly replaced the Blackbird as seen through the camera – all in real time.

Later, this same real-time technology was used to create, alter, and produce a short film, which uses augmented reality (AR) to highlight a new methodology of production. Called “The Human Race,” the film merges real-time photorealistic visual effects and live-action storytelling, as the two Chevrolet vehicles – one an autonomous futuristic car and the other a high-end sports car – zip over a winding mountainous road during a race between man and machine.

Neither car is real in the live-action scenes; both are computer-generated. What’s special here is that they were not added in post. Rather, they were rendered at photoreal quality in real time and composited seamlessly into the scene, utilizing the lighting information captured by the Blackbird, the only physical vehicle at the location aside from the film car carrying the director, DP, and others.

The film will be used to kick off a multi-platform campaign marking the 50th anniversary of the Camaro. First, though, a live re-creation of the AR visualization was shown during Epic’s presentation at the Game Developers Conference in March, showcasing the real-time technologies used for the project.

“I think this is the first time that there has been a real-time live-action film done with real-time visual effects in it,” says Angus Kneale, COO at The Mill. “With Unreal’s cutting-edge game-engine technology, filmmakers are able to see their photoreal digital assets on location in real time. It also means the audience can effect change in films in ways previously unimagined, giving interactive control over vehicles, characters, and environments within a live-action cinema experience. With Mill Cyclops, we are able to render and integrate digital assets into the real world to a level never seen before. Cyclops takes the reflections and image-based lighting, and lights the vehicle in real time while the IBL changes for every frame, corresponding to the background plate. That is revolutionary.”

Joining Forces

The Mill has a small but fast-growing real-time group in New York that uses Epic’s Unreal Engine (UE) for its projects. Early last year, Epic representatives, including CTO Kim Libreri, visited the studio, where they were shown the Blackbird. “It blew us away, seeing the novel approach to how car commercials could be done,” says John Jack, special projects producer at Epic.

The two groups floated the idea of a joint venture utilizing the Blackbird, Unreal Engine, and Mill Cyclops, a virtual production tool kit that was born in 2014 as an on-set mobile VR app and has since been incorporated into Unreal. “That was the genesis,” says Jack. “We then got to the point where we thought we could produce an entire commercial with an AR component, then an entire short film, all in UE, taking the traditional post process out and doing the whole thing in real time.”

The Mill, which had worked with Chevrolet on previous projects, pitched the idea to the car company’s agency, Commonwealth, with which it also had a prior relationship. Both the client and the agency were in; like Epic and The Mill, they were looking beyond the obvious to what the project could lead to.

“ ‘Finding New Roads’ is more than just a tagline at Chevrolet. We embrace that mission in the advanced engineering of our cars and also in the way we service and communicate with our customers,” says Sam Russell, general director of global Chevrolet marketing. “The technology involved in producing this film provides a glimpse into the future of customer engagement and could play a unique role in how we showcase car model options with interactive technologies like AR and VR.”

Preproduction Road Trip

The Mill was the driving force behind the project, providing the creative, direction, final edit, and so forth, as its real-time team and 3D artists worked hand in hand along with Epic’s UE team. Filming occurred “from first light to last light” on December 20, 2016, at Angels Pass in California. There were two practical vehicles on location that day: the Blackbird and the camera car.

Footage was shot from the traditional camera car, which was outfitted with some non-traditional equipment. Additional aerial footage was filmed from a drone, although only a few of those shots were used in the final film – two overhead shots and those showing the entrance and exit of the tunnel. In the film, the terrain, action, road, sky, trees, mountains, and tunnel approach were all live action, with the exception of the two Chevy vehicles. Once inside the tunnel, everything became CG.

Meanwhile, 3D artists at Epic built the two vehicles that would star in the film: the Camaro ZL1, an unreleased production car at the time (limited in number and availability), and the even rarer FNR, a one-of-a-kind concept car. While the vehicles themselves are anything but common, the scenario is: car commercials are produced well in advance of a model’s release, which often means a vehicle is unavailable for filming, or the manufacturer is unwilling to risk having the design leaked by putting the car in public.

In the past, this meant generating high-quality CG cars, which could take days to render, and adding them to the live-action backplates in post. For this project, Chevrolet provided CAD data for both vehicles, which Epic artists optimized to run in real time, in camera, on location during the shoot.

The shoot was one element of the project. The second was production of the short film, completed at the end of February. All that work was done within the engine, as well. The Mill artists had to set aside much of the traditional VFX software they typically use for a commercial or film and instead approach things in a different way using the game engine (see “The Human Race” below).

“We approached things with a clean slate, so we were not constrained by any previous workflow or processes,” says Kneale. “Everything we did was new. We had to learn new tool sets and new ways of working.”

As Jack notes, the demo material was shot at 24 fps, which was also the target frame rate for playback. “We had close to 2 GB/sec of data flowing into the engine, and that includes EXR live-action backgrounds, 3K HDR lat-longs for lighting and reflections, 2D elements like mattes, and photographic elements like lens flares,” he says. “Most shots had three EXR elements streaming in real time.”
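Those numbers are easy to sanity-check. Here is a back-of-the-envelope sketch (the resolutions and channel layouts below are illustrative assumptions, not the production specs) showing how quickly uncompressed EXR streams add up:

```python
# Rough bandwidth estimate for uncompressed EXR streaming.
# The resolutions and channel counts are illustrative assumptions,
# not the actual Project Raven specs.

def stream_rate_gb_per_sec(width, height, channels, bytes_per_channel, fps):
    """Uncompressed data rate of one image stream in GB/sec."""
    frame_bytes = width * height * channels * bytes_per_channel
    return frame_bytes * fps / 1e9

# Assume a 2K-class background plate: RGBA, 16-bit half floats, 24 fps.
plate = stream_rate_gb_per_sec(2048, 1080, 4, 2, 24)

# Assume a 3K lat-long HDR for lighting/reflections: RGB halves, 24 fps.
latlong = stream_rate_gb_per_sec(3072, 1536, 3, 2, 24)

# "Most shots had three EXR elements streaming in" -- say, two plate
# elements plus the lighting sphere.
total = 2 * plate + latlong
print(f"plate {plate:.2f} GB/s, lat-long {latlong:.2f} GB/s, total {total:.2f} GB/s")
```

Under those assumptions, the three streams come to roughly 1.5 GB/sec – the same order of magnitude as the figure Jack cites.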

The Technology

As Kneale notes, all the different technologies used for Project Raven came from other projects over the years, serving as building blocks to get to this milestone.

The Blackbird, which The Mill built and launched last year, gave wings to the project. It is the first fully adjustable car rig: its chassis can be quickly transformed to match the exact length and width of almost any car, and its electric engine and adjustable suspension can mimic the performance characteristics of any vehicle. A camera array and stabilization unit enable it to capture high-quality footage as it maneuvers through a physical environment. Meanwhile, HDRI and 3D laser scanning are used to generate a virtual version of the environment, which is later used to produce photorealistic CG renders in real time.

“It was conceived out of the repeated problems we kept seeing on auto advertising projects,” says Kneale. “When revisions came out, rather than shooting a car all over again, the advertiser would ask us to use an older model and make minor changes in CG to the grille or headlights, for instance, to make it look like the new version.” Over time, The Mill found itself augmenting larger and larger car parts.

The Blackbird aids the group in this process by capturing lighting/reflection data from the physical environment so it can be applied to the CG imagery, elevating the reality factor a few notches and making it appear as if the digital model had been truly filmed on location.

THESE DIGITAL CARS WERE INTEGRATED IN REAL TIME INTO THE LIVE ENVIRONMENT FILMED ON SET.

“We knew that there was enough information from the Blackbird to render a photorealistic car. We just didn’t know if we could do it in real time,” says Kneale. “But Epic was as invested in the project as we were, or even more so because they wanted to show this off at GDC. There was no other option than to pull it off.”

The real-time capability fell to Epic, as the company implemented features and functions within UE that would support Blackbird’s image capture and the filmmaking process following the shoot.

“There were a lot of items that were on our long-term road map, but we used this project to accelerate some of them, such as streaming video feeds and EXR sequences for background plates in live action, as well as dynamic image-based lighting in spherical environments,” says Jack. “At every point, in every frame along the Blackbird’s path, you get a complete spherical lighting environment, so the lighting environment is not static, but is always changing as the Blackbird moves.” (See “New UE Technology” below.)
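The direction-to-texture lookup at the heart of that kind of spherical image-based lighting is a standard equirectangular mapping, sketched below; the function names are ours, and a production renderer would add prefiltering and importance sampling on top, but the mapping itself is the same one any lat-long IBL uses:

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a unit direction vector (y up) to (u, v) coordinates in an
    equirectangular (lat-long) environment map."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)    # longitude -> [0, 1]
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # latitude  -> [0, 1]
    return u, v

def sample_ibl(env_frame, x, y, z):
    """Fetch the radiance arriving from direction (x, y, z) out of the
    current frame's lighting sphere. env_frame is assumed to be an
    H x W x 3 array of linear HDR values -- and, crucially, a new one
    arrives every frame, which is what makes the lighting dynamic."""
    h, w, _ = env_frame.shape
    u, v = direction_to_latlong_uv(x, y, z)
    col = min(int(u * w), w - 1)
    row = min(int(v * h), h - 1)
    return env_frame[row, col]
```

A renderer samples such a map in many directions per shading point; the point here is simply that swapping in a new env_frame each frame is what keeps the lighting in step with the moving rig.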

These functions were incorporated into an R&D version of the engine. The group began with a pre-release version of UE4 but had to branch off at a certain point so that 4.15 could be released at GDC this past March. Many of the Project Raven experimental features are expected to roll out within the next year.

“The missing component had been the ability to visualize the vehicle in an environment in real time,” says Kneale. “That is when Cyclops became so useful.”

Those real-time developments were vital to Mill Cyclops, which enables artists to render photorealistic digital assets and integrate them into live-action footage instantly – letting directors and creatives work with finished-quality photoreal digital assets live on location or in augmented-reality applications.

“It’s been bubbling around for a couple of years now, but we only launched it last month after we got the performance improvements we needed in UE,” says Kneale.

In a nutshell, Cyclops enables users to ingest multiple video streams and use that information to light, reflect, and integrate CG objects into real-time video feeds. As Kneale explains, it utilizes various tracking technologies to determine what the camera is doing in the scene, so a CG object can live within that scene – and be tracked to that world – in real time.
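In broad strokes – and this is a generic sketch of camera-tracked AR compositing, not The Mill’s actual code – the tracked pose is what lets a CG object “live” in the plate: the object stays fixed in world space while the virtual camera is updated every frame to match the real one.

```python
import numpy as np

def make_view_matrix(cam_rotation, cam_position):
    """Build a world-to-camera matrix from a tracked camera pose.
    cam_rotation is a 3x3 camera-to-world rotation and cam_position a
    3-vector, the kind of data a tracker delivers per frame."""
    view = np.eye(4)
    view[:3, :3] = cam_rotation.T
    view[:3, 3] = -cam_rotation.T @ cam_position
    return view

def project_point(view, fx, fy, cx, cy, world_point):
    """Project a world-space point into pixel coordinates with a simple
    pinhole model, so a CG object anchored in world space lands on the
    right pixels of the live plate every frame."""
    p = view @ np.append(world_point, 1.0)
    x, y, z = p[:3]
    if z >= 0:           # camera looks down -z in this convention
        return None      # point is behind the camera
    return (fx * x / -z + cx, fy * y / -z + cy)
```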

“The intention was for more general usage – to use Cyclops on location so filmmakers could see the VFX in real time and make critical decisions accordingly,” Kneale says. “It is useful at the beginning of the supply chain to catch all the information you will need so when you arrive at the end, you can affect all these different things with the information being captured by Cyclops.”

AMONG ITS ONBOARD TECHNOLOGY, THE BLACKBIRD HAS A CAMERA ARRAY THAT CAPTURES FOOTAGE AS IT MOVES IN THE ENVIRONMENT.

In addition, Cyclops offers viewers a new way to experience traditional film and AR content, giving them control over objects, characters, and environments.

Putting the Tech to Work

On location, the Blackbird maneuvered through the environment, filmed by the camera car. The Blackbird captured the environment at 48 fps with four 6K Red Dragon cameras (using super-wide-angle lenses) mounted on top with a special stabilization unit by Performance Filmworks. That imagery was blended together in real time with Mill Stitch, a real-time stitching algorithm running on a computer inside the Blackbird. The result is a full 360-degree, high-res sphere of footage used to light the CG.
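The principle behind such a spherical stitch (a simplified, far-from-real-time sketch – not Mill Stitch itself) is to walk every pixel of the output lat-long panorama, turn it into a ray direction, and blend the samples from whichever calibrated cameras see that direction:

```python
import numpy as np

def stitch_to_latlong(cameras, out_h, out_w):
    """Naive spherical stitch. 'cameras' is a list of objects with a
    sees(d) predicate and a sample(d) -> RGB method, standing in for
    calibrated, lens-corrected rig cameras (a hypothetical interface)."""
    pano = np.zeros((out_h, out_w, 3))
    for row in range(out_h):
        theta = np.pi * (row + 0.5) / out_h          # latitude
        for col in range(out_w):
            phi = 2.0 * np.pi * (col + 0.5) / out_w  # longitude
            d = np.array([np.sin(theta) * np.cos(phi),   # ray direction,
                          np.cos(theta),                 # y up
                          np.sin(theta) * np.sin(phi)])
            hits = [cam.sample(d) for cam in cameras if cam.sees(d)]
            if hits:
                # Average the overlap; a real stitcher feathers seams.
                pano[row, col] = np.mean(hits, axis=0)
    return pano
```

A production stitcher also corrects lens distortion and runs on the GPU to keep up with the 48 fps capture; the loop above only shows the geometry of the problem.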

The Blackbird was tracked by an Arraiy system, and that information was also fed into Unreal on the workstation, providing the vehicle’s 3D positioning relative to the camera.

All this data was then output via a radio transmitter at 1080p to the camera car, also designed by Performance Filmworks. The camera car was jam-packed with a driver, crane operator, cameraperson, director, DP, and real-time supervisor, along with four monitors and a workstation running Cyclops.

“At that point in Unreal, you have the background, the Blackbird positioning, and the real-time lighting environment, all rendered in real time,” says Jack. The Blackbird was then replaced with realistic, nuanced CG vehicles and output to the director and cameraperson, enabling them to make adjustments during the shooting while the action was unfolding. Meanwhile, everyone else on set watched the Blackbird moving down the road.

The Project Raven workflow calls for the one-to-one replacement of the physical Blackbird with a CG car. But for this project, the group decided to add another CG vehicle to support the film’s story line. As Kneale explains, because the two CG cars are in close proximity in the scenes, the same image-based lighting captured on set could be applied; the vehicles also have the same physical specs that had been programmed into the Blackbird. The key was that the Blackbird was always in front of the camera car. “Sometimes the Blackbird is the Camaro and sometimes it is the FNR,” he says.

IT'S MAN VERSUS MACHINE IN THIS ALL-CG SCENE FROM THE SHORT FILM "THE HUMAN RACE."

The same workflow and concept can be applied for just about any object, not just cars. “We just happened to have a Blackbird, so we used cars. You can use it for any object you put into a scene, any character you want to visualize on set in real time or render in real time,” says Kneale. “At the end of the day, the technology can be used for a multitude of different scales and types of objects.”

Glimpse at the Future

According to Jack, one of the goals was to push the fidelity of what is possible in terms of realistic lighting and realistic CG. “The idea was to get to the point where we, along with The Mill, could deliver something that was high fidelity enough where people didn’t question the use of real-time technology for rendering,” says Jack. “We think we hit that mark with ‘The Human Race.’ ”

CHEVROLET DEVELOPED A CAR CONFIGURATOR BASED ON SUPPORT FOR GOOGLE TANGO, RESULTING IN A PERSONALIZED CAR EXPERIENCE.

This takes virtual film production to a whole new level. As Jack explains, they are not replacing a real-time character that “just looks pretty good.” They are using actual lighting conditions from the environment. “It’s more of a precursor to AR than to virtual production. These are things that absolutely will be required for AR if you are going to add CG into the real world around you.”

'The Human Race'

In a race between a human driver and an autonomous futuristic vehicle, who will win? In “The Human Race,” a real-time live-action short film with VFX, the prize goes to The Mill and Epic Games for the groundbreaking production. But looking down the road, it will be the filmmakers and viewers who benefit from the technology that merges live-action storytelling and real-time visual effects.

Creating the short required new tools to support the inventive workflow, which combines traditional processes with new approaches. The Mill artists received pre-tessellated, high-poly meshes of the car models (the 2017 Camaro ZL1 and the FNR concept car) from GM. These meshes then went through several rounds of decimation and re-topologizing until the crew found the best balance of quality and resolution for ingest into the Unreal Engine.

“What we ended up with is two versions of the car: a high-quality film version, ready for close-up shots, and a heavily optimized version (under one million polygons) for use with Mill Cyclops,” says Joji Tsuruga, real-time supervisor at The Mill.
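Stripped of the manual retopology and visual checks, that reduction loop can be sketched as follows; `decimate` and `poly_count` are hypothetical hooks for whatever DCC or library operation actually performs the work:

```python
def reduce_to_budget(mesh, poly_budget, decimate, poly_count,
                     step=0.8, max_rounds=20):
    """Iteratively reduce a high-poly CAD mesh until it fits a
    real-time polygon budget (e.g., under one million polygons for the
    optimized Cyclops version of the cars). 'decimate(mesh, ratio)' and
    'poly_count(mesh)' are caller-supplied stand-ins, not a real API;
    in production, each round is followed by manual retopology and a
    visual quality check before continuing."""
    for _ in range(max_rounds):
        if poly_count(mesh) <= poly_budget:
            break
        mesh = decimate(mesh, step)  # keep ~80% of the polygons per round
    return mesh
```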

Once edits were completed, the team used Science-D-Vision’s 3DEqualizer and The Pixel Farm’s PFTrack to handle the fine tracking of both the camera and objects. Along with tracking-data imports, all modeling, rigging, and animation tasks were done in Autodesk’s Maya. Texturing was done using Allegorithmic’s Substance Painter. The Foundry’s Nuke was used for plate cleanup, 360 lat-long stitching, camera post moves, lens-distortion maps, and creating additional layers.

Finally, UE was utilized to combine exports from all of the above for shot setup, editing, lighting, compositing, and rendering in real time.

A combination of Lidar scans and photogrammetry was used to re-create parts of the shoot location, which was essential for shot layout and accurate proportions. “We were also given access to a prototype of UE4’s new particle system, Niagara, which allowed us to create incredible effects that were simply not possible with their previous system,” says Tsuruga.

Without question, creating the film was a novel endeavor that required a unique approach. “This was truly a pivotal project to be a part of because it gave us a glimpse into what current workflows will evolve into over the next few years,” says Tsuruga.

While the scene-capture technology is amazing, the truly groundbreaking aspect is the postproduction that occurred in real time. “ ‘The Human Race’ blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities,” says Kneale. “This is a pivotal moment for film VFX and the coming era of augmented-reality production.”

Let’s not overlook one more aspect: giving a customer the ability to configure a car and place it into a piece of curated content. “People can change a car online, but we are showing a new level of marketing engagement,” says Jack.

To this end, Epic worked closely with Google’s Instant Preview development team to support Google Tango, an AR computing platform, within the engine. Using this functionality, Chevrolet demonstrated a car configurator that provides customers with a virtual camera portal into the configurator’s world. Similar to virtual production technology, customers can see the CG car, change attributes, and experience the personalized vehicle more intimately.

Meanwhile, Epic will be incorporating much of the new technology resulting from Project Raven into future game-engine releases, while The Mill is looking at applying the technology to commercial productions.

“We are always looking for new ways to tell stories and help directors get their vision visualized. This will help them create and support stories in new ways we never thought of before,” says Kneale.

And there’s no doubt that the imaginations of many are running wild at what Project Raven has now made possible.

New UE Technology

  • Multiple image streams of uncompressed EXR images (around 1.8 GB/sec)
  • Dynamic Skylight IBL (image-based lighting) for lighting the cars
  • Multi-element compositing, with edge wrap and separate BG and FG motion blur
  • Fast Fourier Transform (FFT) blooms for extra bling (see the sketch after this list)
  • PCSS shadows with directional blur settings
  • Bent-normal AO with reflection occlusion
  • Prototype for Niagara, a next-gen, node-based VFX tool
  • Compatibility with Nvidia Quadro graphics cards
  • Support for Google Tango-enabled devices (currently the Lenovo Phab 2 Pro)
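Of those, the FFT bloom is the easiest to illustrate: instead of repeated blur passes, the bright regions of a frame are convolved with a large glare kernel in the frequency domain, where the cost no longer grows with the kernel’s radius. A minimal single-channel sketch in NumPy follows (the kernel and threshold are assumptions; none of this is UE code):

```python
import numpy as np

def fft_bloom(img, kernel, threshold=1.0, strength=0.5):
    """Frequency-domain bloom for one HDR channel (2D float arrays).
    Pixels above 'threshold' are convolved with a glare kernel via FFT
    -- mathematically the same as a spatial blur, but the cost is
    independent of kernel size -- and the glow is added back onto the
    image. The kernel is assumed to be no larger than the image."""
    bright = np.maximum(img - threshold, 0.0)  # bright-pass filter
    pad = np.zeros_like(img)
    kh, kw = kernel.shape
    pad[:kh, :kw] = kernel
    # Shift the kernel's center to the origin for circular convolution.
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    glow = np.real(np.fft.ifft2(np.fft.fft2(bright) * np.fft.fft2(pad)))
    return img + strength * glow
```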

Karen Moltenbrey is the chief editor of CGW.