Engine-unity
Kathleen Maher
Issue: Volume 39, Issue 4 (Jul/Aug 2016)


Gaming has always had its practical side. The somewhat tortured sobriquet “serious games” has been used to describe non-game applications such as training tools, simulators, configurators, and those for VR and AR development.

The field has been a relatively small and specialized sub-segment of the gaming community. In general, it’s not a big moneymaking field compared to the crazy money that can be made (and lost) in commercial games. Frequently, organizations cobbled together their own game development tools, built from scratch using C++ or assembly code for performance, or used simple open-source game engines.

Game engines have been around for a long time, but they could be expensive and difficult to use, and good engines usually charged royalties. In fact, sometimes developers had to pay royalties for the individual components of game development, such as physics, real-time rendering, facial animation, and so forth.

FREE-FORM

The advantage of game engines is that they offer a unified development environment and a consistent UI from game to game. Middleware is included or perhaps bolted on as needed. But most important, game engines provide real-time interaction and feedback, and with the steady march of Moore’s Law, performance just gets better and the rendering gets prettier.
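For readers outside game development, “real-time interaction and feedback” boils down to a simple loop: the engine updates the scene and redraws it dozens of times per second, so every change is visible immediately. A minimal sketch of that loop in plain Python – no particular engine’s API, with the frame rate and the spinning-object example made up purely for illustration – looks something like this:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # time budget per frame, in seconds

def update(scene, dt):
    # Advance animation, physics, and input handling by dt seconds.
    scene["angle"] += 90.0 * dt  # e.g., spin an object 90 degrees per second

def render(scene):
    # A real engine would draw the frame here; we just report the state.
    print(f"rendering frame, angle = {scene['angle']:.1f}")

scene = {"angle": 0.0}
previous = time.perf_counter()
for _ in range(5):  # a real loop runs until the user quits
    now = time.perf_counter()
    dt = now - previous
    previous = now
    update(scene, dt)
    render(scene)
    # Sleep off whatever remains of the frame budget to hold roughly 60 fps.
    elapsed = time.perf_counter() - now
    time.sleep(max(0.0, FRAME_TIME - elapsed))
```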

Unity changed the game when it arrived on the scene in 2004. The company offers a limited personal version of its game engine for free, and even a full license is royalty-free and can be used for multiple applications. Such a plan encourages experimentation, and as a result, the Unity Engine has been downloaded millions of times for multiple purposes – the company claims four million registered users. And a look at the Unity site reveals a wide variety of non-gaming applications.

The opportunity has not been lost on other companies in game development, either. Two of the leading game engine developers – Crytek and Epic Games – have expanded the payment options for their software to include “free” and provide inviting options for individuals and small organizations. Amazon has acquired a license for CryEngine 3 and is making it available for free in order to build the ecosystem for its Amazon Web Services (AWS). Blender, an open-source provider, offers a free game engine, as well.

Likewise, Autodesk is targeting the application developer market with its Stingray game engine, acquired through the purchase of Bitsquid in 2014. Stingray, too, has no royalty entanglements, and Autodesk offers it free to licensees of its Maya LT modeling and animation software. These are the more popular offerings at the tip of the game engine iceberg. And the crazy rise of interest in VR and AR is also bringing in new developers with their own ideas about what can be done with a game engine.


CRYTEK’S CRYENGINE IS BEING USED TO CREATE SHORT FILMS AND VR EXPERIENCES SUCH AS (INSET, TOP TO BOTTOM) “SKY HARBOR,” “THE CLIMB,” “BACK TO DINOSAUR ISLAND” AND (LARGER IMAGE) “WARFACE ANUBIS.”

MOVIE MAGIC

In 2015, at the Game Developers Conference (GDC), Epic and Weta showed the results of their collaboration using assets from The Hobbit to create a VR application. There was not a lot of information published about it; Weta clamped down on the news even though it was a public presentation. Nevertheless, it was an important milestone because the teams were talking openly about the ways in which game engines can be used with content created for movies. In this case, the Epic and Weta teams were experimenting with VR content creation, but as it turns out, the collaboration goes the other way, as well. Movie companies are using game engine technology on set for live virtual filmmaking.

At the FMX conference held annually in Stuttgart, Germany, for the digital content creation industry, filmmaking engineers talked about their work in adapting game engines for use on set. For example, Digital Domain has developed the Photon tool using the Unity engine as a base. Photon development began with James Cameron’s Avatar and is ongoing.

At FMX, Rob Legato, visual effects supervisor for The Jungle Book, teamed up with Adam Valdez from MPC (The Moving Picture Company) to describe the use of Photon to allow director Jon Favreau and his crew to work within the virtual set. Valdez described Photon as a wrapper for the game engine, which makes it more “filmic friendly” with additional lighting, composition, and photography control.

In addition, VR was used to get inside the set and walk around. The filmmakers could get a better understanding of how the digital set would feel, and the approach enabled more interactive lighting development and shot planning. The Unity game engine supports multiple players, so the filmmakers could work together in VR.

Favreau is a collaborative director and an actor who likes to get multiple points of view about how best to get shots and achieve the effects. As they worked on the virtual sets, Favreau was able to try different approaches and play individual parts; he also interacted with the movie’s star, Neel Sethi. For much of The Jungle Book, the filmmakers were only working with one live actor on a bluescreen set (see “Virtual Verite,” March/April 2016). Sethi’s moves and interactions were mapped out and captured using the Photon system, and the cinematographers could monitor the capture within the digital set.


SHOWN HERE IS A STILL FROM “ADAM,” A REAL-TIME RENDERED SHORT FILM USING UNITY.

Crytek has been playing with similar ideas using its CryEngine beyond traditional gaming. Recently, it announced the spin-off of a new company (and product) called Film Engine, which will further develop CryEngine for these “other” applications. Filmmaking is the low-hanging fruit for the company.

CRYTEK BETS BIG ON FILM ENGINE

At FMX, Crytek introduced the company and demonstrated Film Engine against a greenscreen backdrop. The set was built in the worst possible location for a film stage – a mezzanine floor with huge windows and pillars in Stuttgart’s lovely, albeit impractical, 19th-century Haus der Wirtschaft – but the team made it work.

“Real time is our privilege,” says Cevat Yerli, co-founder, CEO, and president of Crytek. By this, he means that the technology developed by the game industry has enabled real-time visualization. It was developed by necessity, but now it’s a gift.

Crytek has been working on film production tools since 2008, when the company first started developing Cinebox. Early demos of Cinebox show the tool being used to create cinematics, the short story sequences that introduce games and lay out their premise. Cinebox was used in the development of Crytek’s ambitious and cinematic game Ryse: Son of Rome, which relied heavily on performance capture.

As development continued, Cinebox was used to make movies – as its developers had hoped. It was used by previs director John Griffith on The Maze Runner (2014) and Dawn of the Planet of the Apes (2014). According to Yerli, he and his team began to realize that the technology they were developing could play a pivotal role in the film industry – not just for previs, but throughout production.

Jean-Colas Prunier is the creative director for Film Engine. He has been involved in the Cinebox project for some time and joined Crytek in 2014 as the idea for Film Engine took shape. Yerli, Prunier, and their team revamped Cinebox from the ground up to be a tool for the film industry. The product today has an interface that will be familiar to people using production software such as The Foundry’s Nuke or Autodesk’s Flame.

One of the big realizations for the team was that whatever they built had to conform to the pipeline tools already in use. As they developed Film Engine, the team made sure that it supports the core technologies the film industry has developed to grease the pipeline, including Alembic, Ptex, OpenEXR, OpenColorIO, and Collada.
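Supporting those formats means the tool can pick up frames and caches exactly where the rest of the pipeline leaves them. As a small illustration – using the standard OpenEXR Python bindings, not Film Engine code, and with an invented file name – reading the header and one channel of a rendered EXR frame looks like this:

```python
import OpenEXR  # the standard Python bindings for the OpenEXR format
import Imath

# Hypothetical frame rendered elsewhere in the pipeline.
exr = OpenEXR.InputFile("shot010_frame0042.exr")

header = exr.header()
data_window = header["dataWindow"]
width = data_window.max.x - data_window.min.x + 1
height = data_window.max.y - data_window.min.y + 1
print(f"{width} x {height}, channels: {sorted(header['channels'].keys())}")

# Pull the red channel as 32-bit floats (returned as a raw byte string).
red = exr.channel("R", Imath.PixelType(Imath.PixelType.FLOAT))
print(f"R channel: {len(red)} bytes")
```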


AUTODESK IS TARGETING THE APPLICATION DEVELOPER MARKET WITH ITS STINGRAY GAME ENGINE, USED HERE FOR THIS ARCHITECTURAL RENDERING OF AN APARTMENT.

Yerli believes that game engines – which are just now coming into their own – hold the key to better production methods. Indeed, game engines are powerful for film production because they have been developed to provide real-time performance. They’re fast and adaptable, designed to work with motion capture, animation, film content, and programmatic content, with real-time 3D rendering making it all possible.

On that note, Yerli says Film Engine can work with any renderer. It accepts 3D models and animation from any industry tool. As for motion capture, well, it happens in real time. And it’s all available in the engine. The real promise of Film Engine, though, is that people are able to work with one tool for previs, production, and postproduction.

WALKING THROUGH

Yerli is not alone in his enthusiasm for the power of game engines. Autodesk says it is building on its Stingray game engine so it can do more. In fact, the company used technology from Stingray to beef up its Flame production tool. And, technology from Stingray was used to create Autodesk’s new Matchbox Camera FX, which enables in-camera post-processing effects, including ambient occlusion, reflections, and field-of-view adjustments that can be applied and adjusted in real time.
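Effects like these are screen-space tricks: rather than re-rendering the scene, the engine post-processes the frame it already has, which is what makes real-time adjustment possible. The toy sketch below – plain Python and NumPy, not the Matchbox or Stingray API – captures the flavor of one such effect by darkening pixels whose neighbors sit closer to the camera, a crude cousin of screen-space ambient occlusion.

```python
import numpy as np

def toy_screen_space_ao(depth, radius=4, strength=2.0):
    """Return a per-pixel shading factor in [0, 1] from a depth buffer.

    Pixels whose sampled neighbors are closer to the camera (smaller depth)
    are treated as partly occluded and darkened.
    """
    occlusion = np.zeros_like(depth)
    offsets = [(-radius, 0), (radius, 0), (0, -radius), (0, radius)]
    for dy, dx in offsets:
        neighbor = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        # A neighbor in front of this pixel contributes occlusion.
        occlusion += np.clip(depth - neighbor, 0.0, 1.0)
    occlusion = np.clip(occlusion * strength / len(offsets), 0.0, 1.0)
    return 1.0 - occlusion

# Toy depth buffer: a near object (depth 0.2) in front of a far wall (depth 1.0).
depth = np.full((64, 64), 1.0, dtype=np.float32)
depth[20:44, 20:44] = 0.2
ao = toy_screen_space_ao(depth)
print(ao.min(), ao.max())  # pixels just outside the object's silhouette darken
```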

Autodesk is also promoting the use of Stingray in other fields, such as AEC (architecture, engineering, and construction), to develop walk-throughs that let people interact with 3D designs. With a process the company is calling Live Design, users can experience and interact with BIM-accurate designs in real time through a Revit-to-3ds Max-to-Stingray workflow. In addition, the workings of mechanical designs can be demonstrated or tested with the help of Stingray.
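The middle leg of that workflow is mostly data handoff: geometry comes from Revit into 3ds Max, gets cleaned up, and goes back out in a form the engine can load. A minimal sketch of that step – assuming the pymxs scripting bridge that ships with 3ds Max, with invented file paths, and leaving out everything the real Live Design tooling adds – might look like this:

```python
# Run inside 3ds Max, which provides the pymxs bridge to MAXScript.
from pymxs import runtime as rt

# Bring in geometry exported from Revit (hypothetical path).
rt.importFile(r"C:\projects\apartment_from_revit.fbx", rt.Name("noPrompt"))

# ... material cleanup, lighting, and level-of-detail work would happen here ...

# Write the scene back out as FBX for the game engine to import;
# the exporter is chosen from the file extension.
rt.exportFile(r"C:\projects\apartment_for_stingray.fbx", rt.Name("noPrompt"))
```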

NEW MEDIA


EPIC’S UE4 IS BEING USED FOR MANY PURPOSES, INCLUDING REAL-TIME FILMMAKING.

At this year’s GDC, Epic Games moved the ball further. In its opening keynote session at the conference, CEO Tim Sweeney talked about the larger vision he sees for Unreal’s engine technology and showed work being done by Ninja Theory on its Hellblade franchise to make his point. Director Tameem Antoniades showed a gorgeous, and disturbing, scene featuring Hellblade’s Senua fighting her way through the hallucinations of a vision quest. The camera zooms in on her face and shows all the tiny and subtle moves that convey her emotions. Great content, but also the type of stuff the GDC audience drinks in continuously throughout the conference. But then they pulled back the curtain and revealed the actress, Melina Juergens, in a mocap rig, acting out the scene to feed the demo in real time.

“What the camera was to the 20th century, the engine is today,” Sweeney said. “The whole world is being re-envisioned and re-invented around real-time 3D. This is a revolution that is much bigger than the gaming industry.”

Sweeney told the audience that non-fiction uses of the Unreal Engine have increased “tenfold.” Like Unity, the company has an online showcase that illustrates the breadth of applications being developed, from AEC walk-throughs to custom configurators. In the Epic meeting rooms, the company presented virtual showrooms, built by Rotor Studios in Sydney, that let customers build their own Toyota.

THIS IS BIG, REALLY!

“Game engine” was probably never all that great of a name for the kinds of real-time interactive authoring tools that have evolved. Notice that Sweeney simply calls his technology “our engine.” Crytek calls its new tool a “film engine.” And, undeniably, “game engine” does describe the technology’s central role in the world’s most profitable content-creation industry. All that money has stimulated huge innovation.

If a tool is widely useful, it will be widely used, names be damned.

But, there’s something else going on here, and it’s been going on for a long time. Sweeney hints at it when he talks about the importance of the camera in the 20th century. We’re seeing the evolution of digital media that gives actors the ability to directly drive their digital character or, conversely, to not even be on set. Gravity is remarkable for the fact that it is primarily an animated film, but no one ever thinks of it that way.

And then, finally, there’s the fact that we are getting a chance to walk around in movies – we can even roam about on Star Wars’ Tatooine. Heck, we can walk around in a car engine, or sit in the cab of a tractor. The tools used to create content are helping change the content, and the fact that so many more people have access to the tools is accelerating that change.

We are seeing the emergence of new applications that aren’t games, but they are 3D, and they are immersive. They might tell stories, they might entertain, they might be educational, they might introduce new worlds to explore.

They’re kind of like real life… but digital.

Kathleen Maher (Kathleen@jonpeddie.com) is a contributing editor to CGW, a senior analyst at Jon Peddie Research, a Tiburon, CA-based consultancy specializing in graphics and multimedia, and editor in chief of JPR’s “TechWatch.”