For today’s most creative digital filmmakers, pushing the technological envelope is part of doing business.
Case in point: Dawnrunner Productions, a boutique production house in the San Francisco Bay Area known for creating festival-worthy films, award-winning awards-show motion graphics, and innovative commercials, corporate videos, and music videos. Dawnrunner also has a core mission of revitalizing the film industry in the Bay Area.
In each of the past four years, Dawnrunner has been tapped to create the prestigious big-screen 3D full-motion graphics “eye candy” that sets the mood and provides the backdrop and flow for the awards shows held at the Game Developers Conference (GDC), including the Game Developers Choice Awards and the Independent Games Festival (IGF) Awards. These high-profile, high-energy events, honoring the best of the best in the video game industry, require video graphics that are as creatively and technologically impressive as the nominees’ own exceptional work.
“Every year for the GDC awards shows, we push our boundaries to the limit, doing things that are more and more complicated and robust,” said Dawnrunner VP of technology Geoff Peck, who is also responsible for the studio’s technical and visual effects. On the eve of the 2012 shows, Dawnrunner was thrown a curve that threatened to derail its entire production – but the studio stayed on track, thanks to the NVIDIA Maximus platform.
Challenge
For the live 2012 GDC awards shows, Dawnrunner’s 3D motion graphics – which included clips from nominated games – were to be displayed on a 32-foot-diameter circular screen made up of LED lights. Dawnrunner expressed the year’s Liquid Metal theme as a Pink Floyd-inspired graphic environment: a post-apocalyptic landscape, lit by lightning and bright flashes, in which all the water had turned to liquid metal. To create its 3D video graphics, Dawnrunner spent a full month working in a linear, sequential workflow across three primary software applications: creating and rendering 3D models in Autodesk 3ds Max; compositing and doing digital effects work in Adobe After Effects (Adobe Creative Suite 5.5); and editing everything together (including trailers from nominated games and other video clips) in Adobe Premiere Pro (also part of CS 5.5).
“We had built a large render farm to support our efforts, and we relied on brute processing force to create the very complex GDC event video,” said Peck. Even with the render farm, making a change to already rendered material meant reworking a scene, hitting a button, waiting three hours to see the newly rendered result – and realizing that if the changes were wrong, there was no time to go back and do it again.
In the past, Dawnrunner structured the two hours of 3D graphical action in short segments that were looped together with transitions between them. “This year, UBM TechWeb – producer of the GDC ceremonies and our client for this project – asked us to pre-render everything into one big loop,” said James Fox, owner of Dawnrunner and the director and creative force behind everything the studio produces. “That meant a very render-intensive process that resulted in essentially one gigantic two-hour file.” The night before the big awards ceremonies, Dawnrunner had its first opportunity to run its rendered video graphics through the on-site projector.
“To our horror, we saw that a weird technical anomaly – a format mismatch – was causing a divergence of our images, so everything was projected a bit too big for the 32-foot circular screen,” said Peck. “It was a purely technical issue, but it meant we would have to re-render everything on-site.” In addition, UBM TechWeb representatives realized that the font they had approved for the text throughout Dawnrunner’s video graphics did not look right when the projected letters were eight feet tall. All the text would have to be redone as well.
“Switching the font for a fully rendered two-hour video presentation is nothing like switching the font on a PowerPoint slide,” said Peck. “At 5 pm the night before the show, we faced a real nightmare. It was mathematically impossible for us to re-render the entire show in time using our existing equipment. We started calculating priorities, deciding what we would have time to do and what would have to be sacrificed.”
Solution
“When we heard about the last-minute changes that would be needed, our first response was to panic,” said Fox. “We thought we’d have to tear down our render farm, pack it up, bring it backstage at the GDC show, set it back up, get everything cued up, and just try to re-render as much as possible.”
Then Geoff Peck thought of something. A couple of weeks earlier, Dawnrunner had received a new NVIDIA Maximus system that the crew had only begun testing on a Dell Precision T7500 workstation. The Maximus system represents a colossal step forward for digital media production, enabling a single general-purpose workstation not only to render complex scenes immediately – thanks to the NVIDIA Quadro GPU – but also to harness the parallel-computing power of the integrated NVIDIA Tesla C2075 companion processor to run multiple compute-intensive processes simultaneously.
Peck and Fox had brought their new Maximus system with them to the GDC event, “to play with and do some benchmarking whenever we had free time during the show,” said Peck. Peck suggested they make the changes on the Maximus system backstage. Fox agreed: “We were just familiar enough with the NVIDIA system to know what it could do, and we really had no choice but to give it a try.” They set up the Maximus system so they could balance the processing load among their three software applications: 3ds Max and After Effects shared the Quadro GPU, while Premiere Pro primarily used the Tesla card.
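As a rough illustration of that kind of load split – not Dawnrunner’s actual configuration, since in practice each application exposes its own GPU-selection settings – the sketch below shows one generic way to dedicate CUDA devices to separate processes using the standard CUDA_VISIBLE_DEVICES environment variable. The device indices and executable paths are placeholder assumptions for illustration only.

    # Hypothetical sketch: launch each application with a restricted CUDA device
    # list so the two renderers share the Quadro while the editor gets the Tesla.
    # Device indices (0 = Quadro, 1 = Tesla) and executable paths are placeholders,
    # not Dawnrunner's actual setup.
    import os
    import subprocess

    def launch(exe_path, visible_devices):
        """Start a program that only sees the listed CUDA device indices."""
        env = os.environ.copy()
        env["CUDA_VISIBLE_DEVICES"] = visible_devices
        return subprocess.Popen([exe_path], env=env)

    # 3ds Max and After Effects split the Quadro (device 0)...
    max_proc = launch(r"C:\placeholder\3dsmax.exe", "0")
    ae_proc = launch(r"C:\placeholder\AfterFX.exe", "0")
    # ...while Premiere Pro's GPU processing is directed at the Tesla (device 1).
    premiere_proc = launch(r"C:\placeholder\Adobe Premiere Pro.exe", "1")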
Impact
“I cannot describe in words how excited we are about what the Maximus platform can do, and how grateful we are that we had it with us that evening,” said Fox. “Everything worked seamlessly, there were no bugs or glitches, and the software processing loads were perfectly balanced. And amazingly, we were suddenly able to work in real time.”
Peck set up the Maximus system backstage and started making the required changes, which were fed directly into the production projection system, so the results were visible almost instantaneously on the actual event screen. He was able to use 3ds Max with NVIDIA iray to change composition and framing where needed, render in real time, pass those results to After Effects to begin that render, and immediately return to 3ds Max to start the next change for the next part of the show – all while editing in Premiere Pro at the same time.
“Because of the real-time rendering, we knew immediately that the changes we were making were the right ones, and that we wouldn’t have to go back and fix them again,” said Peck. “In a matter of minutes, we re-rendered what would have taken us hours to do using our original pipeline.” With that original pipeline, Peck estimated that accomplishing all the changes required would have taken two or three days, going back and forth between making changes, waiting for them to render, checking them, and tweaking things again. To meet the live show’s deadline with the original pipeline, they would have had to choose which changes to make, and even then they would not have had time to test them.
The Dawnrunner team completed, refined, and tested the changes with enough time to spare that Peck – who in years past had spent the night before GDC events camped out on site, making final tweaks before the show went live – got a good night’s sleep. “There were 10 times more changes this year than in the past, and yet this year was orders of magnitude easier thanks to the Maximus system,” he said. Because everything was tweaked to perfection and tested well ahead of schedule, the Dawnrunner crew could relax and enjoy the actual show.
“For the first time, we didn’t watch the finished production with nail-bitten fingers crossed, hoping that nothing had been missed and that everything would go smoothly during the live event,” said Peck.
Another crucial point was that Dawnrunner’s client, UBM TechWeb, was also able to see the changes as they were being made and confirm that everything was done correctly. “That really put them at ease, and the confidence of that experience will help us secure future engagements with them,” said Fox.
“The overall success of the show is leading to business connections with other companies as well. We expect our new nimbleness to translate into enough additional work for Dawnrunner to cover the cost of the Maximus system 10 times over.” Experiencing the Maximus system and its capabilities in such a desperate moment has convinced the Dawnrunner team that the technology will be an invaluable asset going forward.
“We are filmmakers first and foremost, and we always want our stories to come to life,” said Fox. “In 99.98 percent of the projects we’ve done in the past, we were forced to make severe creative sacrifices because of our rendering limitations. Small boutique studios like ours can’t afford to make creative sacrifices. We believe the Maximus system tips things in our favor, creatively and competitively.”
“One of our root goals is to help expand the Bay Area film industry in every way possible,” he said. “With Maximus systems at Dawnrunner and in the hands of a couple of other boutique studios, you’ll be amazed at the quality of work that will start coming out of the Bay Area.”