Virtual production has recently begun cementing its place in filmmaking. The real-time technique aims to enhance the pre-visualization process, especially in the world of complex visual effects. More recently, however, its use has been growing on set, during production itself.
We have previously explored the different uses of virtual production, both in pre-vis and in production. Since then, the technique has advanced much further. Whereas before, virtual production was used to create the more complex VFX scenes, it is now serving as the backbone of entire productions, paving the way for a more immersive way of filmmaking.
With virtual production, filmmakers and creatives alike can build digital environments in the pre-vis stage, before ever stepping onto a physical set. Then, using VR, filmmakers can access these environments on set: all they need to do is put on a headset and they are immersed in the world they created.
This is a game-changer for VFX-heavy films, as it allows directors to see the action in the context of the rest of the film. They’re not relying on guesswork; rather, they can edit and reassess on the day of shooting, allowing them to create the best possible visuals for their film.
King of the Jungle
Jon Favreau is one of the few directors experimenting with the use of virtual production during filming. Having explored its possibilities with MPC’s The Jungle Book, Favreau then took this one step further, using virtual production to create the live-action remake of The Lion King.
Adding to the already ambitious task of re-creating a beloved classic, the film was to be entirely CGI, with the exception of the famous opening shot, the only real footage in the whole movie. The photorealistic look of the film, paired with the way the camera moves, makes it feel like a nature documentary, with the camera taking the audience on the same journey as the animals.
For this to work, the team at Moving Picture Company (MPC) created a 3D virtual environment that encompassed the entire savanna landscape and mirrored the classic locations of the original film. From the elephant graveyard to Rafiki’s Ancient Tree, Favreau and his team could put on VR headsets and immerse themselves in the virtual space, scouting the best locations and methods to create the visuals for the film.
Outside of their virtual world, the crew worked in a technological blank space called the volume. Within this space, a grid of virtual reality sensors fed information back to the computers, such as where the virtual cameras were positioned within the space.
The crew still used largely traditional production methods and equipment, such as handheld cameras, dollies, and tracks. However, what they saw emulated how the final shot would look: they could see how the camera interacted with the virtual environment and make alterations then and there.
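To make the mechanics of this concrete, here is a minimal, purely illustrative Python sketch of the idea described above: a physical camera tracked in the volume drives a virtual camera each frame, so the framing of the digital set always follows the real dolly or handheld move. All the names here (Pose, read_tracker, VirtualCamera) are hypothetical stand-ins for illustration, not part of any actual production pipeline.

# Hypothetical sketch: a tracked physical camera driving a virtual camera, once per frame.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in metres, in volume space
    rotation: tuple   # (pitch, yaw, roll) in degrees

def read_tracker() -> Pose:
    """Stand-in for the volume's sensor feed; a real system would stream
    poses from the tracking grid many times per second."""
    return Pose(position=(1.2, 1.6, -3.0), rotation=(0.0, 35.0, 0.0))

class VirtualCamera:
    """Stand-in for the engine-side camera that renders the virtual set."""
    def __init__(self):
        self.pose = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def apply(self, pose: Pose):
        # A real pipeline would also carry lens data (focal length, focus
        # distance) so the virtual image matches the physical optics.
        self.pose = pose

def tick(camera: VirtualCamera):
    """One frame of the loop: physical camera move in, virtual framing out."""
    camera.apply(read_tracker())

cam = VirtualCamera()
tick(cam)
print(cam.pose)  # the virtual camera now frames the scene exactly as the physical move dictates

In practice this loop runs continuously, which is why the crew could judge the final framing live rather than waiting for a render.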
(For an in-depth look at the making of this film, see “Pride of Place” in the Summer 2019 issue of CGW, available on cgw.com.)
This real-time set visualization allows filmmakers to focus on the story, rather than trying to create a shot without knowing what the finished product will be. Cinematographers can create and light their scenes and see the changes as they are made in real-time, much like they would in a traditional production.
This makes filming easier, since the crew can tweak or remove unwanted elements from a scene while still on set, but the method alone doesn’t always achieve the highest-quality results. After the production has wrapped, a great deal of work still falls to post-production teams and VFX artists to craft the visuals into the quality usually seen on our cinema screens.
Walls of the future
Despite the heavy post-production work that’s still required, one of virtual production’s unique selling points is its ability to unlock creative collaboration earlier in the production pipeline. Traditionally, while most directors work closely with the VFX supervisor on set, there is often still a barrier between the production and post-production teams. Virtual production allows for greater collaboration between these two stages, blurring the lines of the traditional filmmaking we’re accustomed to.
This can be seen in a relatively new phenomenon emerging in on-set production: the use of LED screens. The concept stems from green screens, a staple of the film industry, in which actors stand in front of giant green backgrounds, interacting with imaginary landscapes and creatures. Whereas directors would otherwise have to wait until post-production to see all the elements of the film together, with LED screens this happens in real-time, as photorealistic 3D landscapes are displayed behind the actors.
While the technology has been used on a smaller scale before, it has recently come to fruition with the release of The Mandalorian, where it was used to create many of the key scenes. To do this, ILM created a volume called Stagecraft, consisting of floor-to-ceiling LED screens. These dynamic screens displayed the unique landscapes of the Star Wars universe and were rendered using the camera’s positional data.
(For an in-depth feature on the use of Stagecraft for The Mandalorian, see “Bounty-ful VFX” in Edition 1 – 2020 of CGW, available on cgw.com.)
This meant that as the cameras moved, the 3D scene changed to reflect the movement, giving the image depth and eliminating the possibility of the background looking flat on film. However, LED screens still place slight restrictions on how the camera can move. There’s a limit to how close the camera can get to the screens, as getting too close can produce moiré patterns or stripes, an effect often seen when photographing digital screens.
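As a rough illustration of why that tracked re-render matters, the toy Python sketch below (illustrative numbers and a simplified one-dimensional pinhole projection, not any studio’s actual code) shows that when the camera slides sideways, virtual objects shift on screen in proportion to one over their depth, so near and far parts of the background move by different amounts. That differential shift is the parallax a static backdrop can never provide.

# Toy parallax demo: how much does a virtual object's screen position shift
# when the camera translates sideways, at two different virtual depths?

def screen_x(world_x: float, depth: float, camera_x: float, focal: float = 35.0) -> float:
    """Simplified 1D perspective projection onto the image plane."""
    return focal * (world_x - camera_x) / depth

for depth in (5.0, 50.0):  # a near rock vs. a distant dune, in metres
    before = screen_x(world_x=0.0, depth=depth, camera_x=0.0)
    after = screen_x(world_x=0.0, depth=depth, camera_x=0.5)  # camera slides 0.5 m right
    print(f"depth {depth:>5.1f} m: image shift = {before - after:.2f} units")

# The near object shifts ten times more than the far one, which is exactly the
# depth cue that re-rendering the wall from the tracked camera position preserves.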
Despite this minor hitch, LED screens bring a host of new possibilities to the industry. One of these is reducing the need for on-location shooting. Not only does this save money, it also means crews no longer have to wait for the perfect light or time of day to shoot: LED screens can hold a particular time of day for as long as it’s needed.
On top of this, LED screens are a perfect example of the collaborative possibilities that virtual production and its real-time capabilities unlock within VFX. For the technology to work, artists have to create environments for the LED screens in advance. Once displayed on the screens, these landscapes can be manipulated as the director requires, meaning artists and animators are needed on set, both to make those changes and to step in if anything goes wrong. The result is that all the different areas of production come together during the shoot, each with an important role to play.
Real-time and beyond
Circling back, using virtual production on set puts the tools of storytelling back into filmmakers’ hands. Instead of pre- and post-production teams working separately toward the same goal, they can collaborate. Directors can see what is being created while filming and make decisions about the action and the way things are shot in real-time, rather than reviewing the footage further down the line. The artists realizing the director’s vision are privy to these creative decisions sooner, creating the perfect conditions for unbridled creative collaboration across production teams and beyond.
Needless to say, the success of virtual production techniques in productions like The Lion King and The Mandalorian, and the pace at which they continue to develop, mark a very exciting time for the industry: for directors, artists, and our viewing habits alike.
For more technology-related stories, see Foundry’s Insights Hub page at https://bit.ly/insightshubcgw.