Game Engines Power Content Creation
By Kathleen Maher
Issue: Summer 2019

There was a time when game engines were built specifically for certain games, and developers believed the quality of their game engines defined the quality of their games. But as the game industry has matured, with thousands and thousands of games developed and played, and franchises established, game developers have come to realize that their time may be better spent creating characters, fun playgrounds for their players, and compelling story lines.

As a result, game engines have become stable resources for developers, with Unity's engine, Epic's Unreal, and Crytek's CryEngine leading the pack. Sure, people still build their own game engines, but there is a wealth of ready-made options, and using an established leader can eliminate some risk in an inherently risky business.

Game developers have gone through many trials by fire as they struggle to meet deadlines and budgets, even as demand for new games increases. Many companies fail, and those that survive have to find better ways to get the job done. The film and TV industries have had similar challenges. Demand for visual effects has driven up budgets, sometimes with little benefit to the quality of the finished product, and increasing demand for content to satisfy streaming audiences is challenging traditional pipelines.

So, what are the characteristics that make a game engine useful for content creation? One, a game engine is a central repository for all the assets needed to create a game or application; thus, it supports a wide variety of DCC file formats for models, textures, lighting, animation, and so forth. Two, it's real time, meaning ideas and changes can be applied dynamically to see how they look. Rendering is instantaneous, so mistakes are immediately obvious - and serendipity comes for free. And three, Unity and Unreal both have large communities of active developers who submit tools for specific applications to the companies' asset stores.

GAME ENGINES ARE PLAYING A ROLE IN VIRTUAL FILM PRODUCTION, AS WAS THE CASE FOR THE JUNGLE BOOK AND WELCOME TO MARWEN (TOP).

Seeing Through the Camera

In a recent industry trade article, Rob Bredow, senior VP, executive creative director, and head of ILM, talked about the work ILM did for the Steven Spielberg film A.I. (released in 2001). He said a game engine was used to feed images to the greenscreen and, with the studio's tracking system, was able to give actors a reference point within Rogue City, the movie's futuristic take on Las Vegas as the land of temptation. At that time, the engine was a repurposed version of Unreal Tournament's. ILM has since created its own engine for bringing the digital components into the real world of film production.

Now, much of the work game engines are doing in this realm is in conjunction with virtual cameras - tools that add a monitor, often an iPad, providing a live window into the game engine at work.

At FMX 2019, cinematographer Matt Workman revealed the work he has been doing to develop Cine Tracer in Unreal to enable filmmakers to block out scenes, try out lighting, and quickly build rough sets. He's using Cine Tracer to work on films with directors and DPs. It functions like a video game and, in fact, is available through the Steam store as an Early Access game. It includes storyboarding features as well as real-time scouting.

READY PLAYER ONE USED VIRTUAL PRODUCTION TOOLS TO CREATE VIRTUAL PROPS, SETS, AND SO ON THAT RAN IN A MODIFIED UNITY.

As he develops the program, Workman is adding real-world cameras and lighting. It's important to stick with the language of cinematographers and use actual focal lengths, he says, because "a DP will think something is bogus if a strange focal length is offered."
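
Sticking to real lens language is mostly a matter of respecting pinhole-camera math: a focal length and a sensor width determine the field of view a DP expects to see. Here is a minimal sketch of that relationship in Python (illustrative only, not Cine Tracer's code; the full-frame 36mm sensor width is an assumption):

```python
import math

def horizontal_fov(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view (degrees) for a given focal length.

    This is the standard pinhole-camera relationship; a full-frame
    36mm-wide sensor is assumed by default.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A DP's familiar primes map to predictable fields of view:
for f in (24, 35, 50, 85):
    print(f"{f}mm lens -> {horizontal_fov(f):.1f} degrees")
```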

Essentially, Workman has made an application to streamline previs and production camera work, and he's shortcutting a lot of the DIY work that went into early efforts to use game engines for previs. Significantly, his tool, and others like it, are also helping bridge the work of cinematographers, visualization artists, and VFX teams.

Some of the same synergy happened during the making of The Jungle Book. MPC has been on the front lines of virtual camera use and worked with Unity to adapt the engine for film use, giving director Jon Favreau the ability to interact with real-life and digital characters on the set.

Unity Creative Director Adam Myhill developed Cinemachine to emulate real-world cameras and lighting within the Unity game engine; Unity acquired Cinemachine, and Myhill joined the game developer shortly after Jungle Book was released. Unity has further developed Timeline, which adds multitrack sequencing and interactive creation within the game engine. Girish Balakrishnan was lead technical director on Jungle Book at Digital Domain (prior to moving to MPC), which was responsible for the virtual production.

Myhill, too, has a cinematographer's sensibility. His work has been to bring an understanding of the visual language of film to game engines, and cinematic literacy for game engines makes them more useful to the people making movies. It's a two-way street. Cinematic literacy also makes for more compelling game development and better animatics, the filmic sequences used to introduce game story lines and provide transitions.

For film, the tools have to move, record, and present content in the same way that actual cameras do, Myhill points out. Audiences, he says, have been immersed in 100 years of cinematic language. People can feel it when it's not right, and that goes for those on the set as well as in the audience.

Director Favreau has committed once again to virtual production for his current film, The Lion King (see page 4), for which MPC and Magnopus collaborated: MPC built all the real-time assets (characters, environments, animation) for use in Unity, as well as the final VFX; Magnopus built the multiplayer VR tools, including an AR/VR interface that enables the director, cinematographer, production designer, and visual effects supervisor to work together.

Unity has been buying technology that helps bring the game engine into production pipelines. Last year, the firm bought Digital Monarch Media, a company founded by virtual production developers Habib Zargarpour and Wes Potter. They developed the system used on Jungle Book, as well as on Ready Player One and Blade Runner 2049. The two have also worked on games, including Ryse: Son of Rome and Need for Speed.

Over a period of 10 years, Potter and Zargarpour have been building and refining their virtual production system, which includes Expozure, a virtual cinematography environment built on Unity, and Cyclopz, a virtual camera interface that uses mobile devices, including iPads and phones. The system has lighting and lens tools that reflect real-world equipment.
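
Under the hood, any such virtual camera interface comes down to the mobile device tracking its own position and orientation and streaming that pose to the engine, which applies it to the virtual camera each frame. A minimal, hypothetical sketch of the device side in Python - Cyclopz's actual protocol isn't public, and the address, port, and packet format here are invented for illustration:

```python
import json
import socket
import time

ENGINE_ADDR = ("192.168.1.50", 9000)  # assumed host/port for the engine listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(position, rotation_quat):
    """Send one camera pose sample as JSON over UDP.

    position: [x, y, z] in meters; rotation_quat: [x, y, z, w].
    """
    packet = {"t": time.time(), "pos": position, "rot": rotation_quat}
    sock.sendto(json.dumps(packet).encode(), ENGINE_ADDR)

# e.g., called at 60 Hz from the device's AR tracking callback:
send_pose([0.0, 1.6, -2.0], [0.0, 0.0, 0.0, 1.0])
```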

"We don't see it as a box product," says Potter, "every single film is different."

For instance, while working on Ready Player One, Potter was able to build a pipeline while sitting right next to Spielberg, enabling the director to do a shot in a different way than previously planned. The power of game engines, adds Zargarpour, is that "you'll be able to get finished-quality visual effects in real time."

Zargarpour and Potter contend that the people working on the film loved the tools because they made creating visual effects more like playing a video game.

ED FILMS IS USING GAME ENGINES TO BUILD SCENES. FOR GIANT BEAR, UNITY WAS USED FOR VISUALIZATION.

New Player Up

Glassbox is a start-up founded by people from Foundry, along with Norman Wang, founder of Opaque Media. They're trying to make virtual production as easy as capturing movies on an iPad, but powerful enough for professional production. To do so, they teamed with The Third Floor, a virtual production company that has worked on almost every major movie using virtual production technology. Founded in 2004 by a team at Lucasfilm who had worked on Star Wars: Episode III - Revenge of the Sith, The Third Floor is known for its work on James Cameron's Avatar, which established performance capture as a medium distinct from animation paired with vocal performances.

Glassbox is debuting its first product, DragonFly, which works with Autodesk's Maya, Unity, and Unreal. It's a low-cost system that will sell for $750, including support and updates for a year. The company does see its product as one that comes out of the box, enabling small teams to use virtual cameras for filmmaking and VR development.

ON THE SET DURING PRODUCTION OF WELCOME TO MARWEN. (PHOTO COURTESY MPC.)

The company is also developing BeeHive - software that will enable virtual scene syncing, editing, and review - with delivery expected soon after DragonFly. The systems will support multiple users and multiple platforms.

DragonFly is being used by VR Playhouse and partner company Strange Charm, which has been working on several projects, including a digital human piece for Unreal at SIGGRAPH 2019. The system is also in use by the Cameron Pace Group for a Chinese production of Genghis Khan.

Virtual camera systems like Cine Tracer, and tools from Digital Monarch and Glassbox, are designed to take quick snapshots to provide instant storyboards. They can block out a scene with camera movements, and increasingly they're moving into actual production, as we're seeing in the work Favreau is doing with MPC. 

Out the Other End

The most significant effect of game engines in filmmaking is that they are helping to bring disparate groups together. In a recent podcast, Epic Games CTO Kim Libreri talks about how VFX has become separated from the rest of the production team, and "a little bit of the fun has gone out of production." Libreri, who famously worked on The Matrix, Catwoman, and Super 8, played a role in that evolution, but now, he says, he misses the spontaneity and sense of discovery the process once had. The ability to work with game engines on set is bringing it back, and as the rendering capabilities of game engines have become more sophisticated, people can see exactly how the work is going to look in the final render.

Like Unity, Unreal has been adapting its engine to filmmaking paradigms. The new Sequencer Editor feature is a multitrack cinematic editing tool that works within Unreal. It allows content creators to add different camera tracks and cycle between them to find the right looks and pacing.
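
Conceptually, such a camera track is just an ordered list of time ranges, each bound to a camera, and the editor shows whichever camera is live at the playhead. A toy Python sketch of that data structure (illustrative only, not Unreal's API):

```python
from dataclasses import dataclass

@dataclass
class CameraCut:
    start: float   # seconds
    end: float
    camera: str    # name of the camera to view through

# An editor-style camera cuts track: entries can be swapped or
# reordered to try different looks and pacing.
track = [
    CameraCut(0.0, 2.5, "WideMaster"),
    CameraCut(2.5, 4.0, "CloseUp_A"),
    CameraCut(4.0, 6.0, "DollyIn"),
]

def active_camera(track, t):
    """Return the camera that is live at playhead time t."""
    for cut in track:
        if cut.start <= t < cut.end:
            return cut.camera
    return None

print(active_camera(track, 3.1))  # -> CloseUp_A
```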

STEVEN SPIELBERG ON SOLO 2 SET.

Unreal has tackled characters in a big way, using the virtual camera to drive performances and even give actors a role in the visual effects. Welcome to Marwen is director Robert Zemeckis's latest sacrifice on the altar of cinema-tech advancement: a strange, mostly true story about Mark Hogancamp, a veteran who built a toy town in his backyard as part of his recovery from a crippling hate crime. To re-create Hogancamp's dream world, Zemeckis worked with longtime collaborator Kevin Baillie, VFX supervisor at Method Studios.

Baillie talked about his work on the film at FMX 2019. Zemeckis, who was excoriated when his groundbreaking digital Tom Hanks character in The Polar Express strayed too far into the uncanny valley, had no intention of going that route to re-create the toy characters Hogancamp uses to play out heroic fantasies in Marwen. Instead, the team embraced the limitations of current 3D work and scanned the actors to create 3D-printed dolls, which were in turn scanned back to digital to create the doll characters to be driven by the actors. The actors' performances were captured on a mocap set with lighting identical to that of the Marwen toy town so the eyes and mouths of the actors could be composited and blended onto the doll characters, enabling the soul of the actors to shine through.
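
The final blend Baillie describes is, at its core, the classic "over" composite: the actor's matted facial performance laid over the rendered doll. A minimal Python sketch of that operation (a straight-alpha version for illustration, not Method's actual pipeline):

```python
import numpy as np

def alpha_over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' composite: foreground (the actor's captured
    eyes and mouth) laid over the background (the rendered doll face)."""
    a = fg_alpha[..., None]  # broadcast the matte across RGB channels
    return fg_rgb * a + bg_rgb * (1.0 - a)

# e.g., for tiny 4x4 test plates with a soft 50% matte in the center:
fg = np.ones((4, 4, 3)) * [0.8, 0.6, 0.5]   # skin-tone plate
bg = np.ones((4, 4, 3)) * [0.4, 0.3, 0.3]   # doll render
alpha = np.zeros((4, 4))
alpha[1:3, 1:3] = 0.5
print(alpha_over(fg, alpha, bg)[2, 2])      # -> [0.6 0.45 0.4]
```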

Baillie says the magic came as the actors could see their movements driving the dolls. He said the actors were eager to check the performance playbacks to see how their dolls did in the scenes.

Alas, the film did not do well at the box office. It's an odd story, and Zemeckis has some kind of drive to push technology to see what it can do, even if it doesn't do much for his bottom line. In this case, according to Baillie, the film demonstrated the power of bringing visual effects and actors together in the process. And again, Zemeckis has pushed the technology of motion capture forward.

Animating It All

In 2019, animation is a major front line for advancing game engine use in film and TV.

ED Films, a Canadian film company, is upending film production for small teams. It's creating a wealth of content and using game engines to build scenes, dynamically try out different looks, and reuse models, characters, and sets. For its film Giant Bear, which was first shown at the GLAS Animation Festival this year, the team did most of the visualization in Unity. They created Scene Track as a tool to export content from the game engine to Maya for further work. The film is lovely, intense, and explosive.
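
An engine-to-DCC handoff like the one Scene Track performs amounts to sampling the scene every frame and writing the transforms in a form Maya can rebuild as keyframes. A simplified, hypothetical Python sketch of the recording step (the object names and JSON format here are invented, not Scene Track's actual output):

```python
import json
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    position: list  # [x, y, z] in scene units
    rotation: list  # Euler angles, in degrees

def record_frame(objects, frame):
    """Snapshot every object's transform for one engine frame."""
    return {
        "frame": frame,
        "objects": {
            o.name: {"translate": o.position, "rotate": o.rotation}
            for o in objects
        },
    }

# After the engine steps each frame, append a snapshot; the resulting
# JSON can then be rebuilt as keyframes in Maya by a small import script.
objects = [SceneObject("bear_root", [0.0, 0.0, 0.0], [0.0, 90.0, 0.0])]
frames = [record_frame(objects, f) for f in range(240)]  # 10 seconds at 24 fps
with open("shot_010_anim.json", "w") as fp:
    json.dump(frames, fp)
```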

THE DRAGONFLY SYSTEM BY GLASSBOX IS DESIGNED FOR THOSE ON SET BUT DOESN’T REQUIRE KNOWLEDGE OF HOST SOFTWARE.

The company's president, Emily Paige, said at an FMX panel presentation that the ability to work in real time and see rendered content in context was invaluable to getting the work done within the team's tight budgets of time and money. But she also noted there is still a lot of work to be done to make game engines suitable for artists and filmmaking. For instance, Giant Bear involved fur and clothing, and the filmmakers were going for a painterly look that the Unity engine didn't support at the time. As a result, ED Films relied on Maya, as well as Adobe After Effects, to render the film traditionally.

ED Films has developed a clever bootstrap method. It sells tools like Scene Track as well as assets and training to help fund the studio's ongoing creative work.

The role of the game engine has grown dramatically in the animation pipeline during the last few years, and specific problems are being addressed. Disney Television Animation's R&D group invested in the development of Baymax Dreams with Unity.

ED FILMS USED A GAME ENGINE COMBINED WITH MAYA TO ACHIEVE A PAINTERLY LOOK FOR THE FUR IN GIANT BEAR.

Director Simon J. Smith (Penguins of Madagascar, Bee Movie) used virtual reality and Unity not only to explore storytelling and development, but also to spontaneously try different lighting and camera shots within moments. As a result, the team was making significant changes days before the finished product was due. In this case, they were able to use the Unity engine as a finishing tool.

To be fair, the Baymax Dreams films are shorts, and the whole Big Hero franchise is a celebration of simple, expressive animation and fun. But the advantages of non-linear approaches and of collapsing the walls between isolated specialists are changing the way content is made at a time when there is huge demand for content online and on television.

As new ways to make content take hold, the content changes. The mania for huge effects movies has not only taken some of the magic and spontaneity out of the process, but that loss is often reflected in the movies themselves. It's no wonder real-time approaches are getting support and investment from the film and video communities. Real-time rendering is out there as a goal, but the real win seems to be coming in bringing creative people together.

Kathleen Maher (Kathleen@jonpeddie.com) is a contributing editor to CGW, a senior analyst at Jon Peddie Research, a Tiburon, CA-based consultancy specializing in graphics and multimedia, and editor in chief of JPR's "TechWatch."