Thanks to its successful use on projects like Disney’s The Mandalorian, virtual production is taking Hollywood by storm, and it’s just a matter of time before you’ll find yourself shooting inside an LED volume. With the technology still at the cutting edge, there are some essentials to consider before budgeting, tech provisioning, and filming with virtual production.
Virtual production using large-scale LED or projection technology was already burgeoning before the advent of COVID-19, as evidenced by films such as Oblivion, Gravity, Rogue One and First Man, but the pandemic has kicked the practice into high gear. It dramatically reduces crew footprints and eliminates the need for travel and location work; “the operational impact of this technology cannot be overstated,” according to Noah Kadner, virtual production editor at American Cinematographer.
“Major productions that would have been shot in real-world locations or on green screens have been reconfigured to be partially or entirely shot on LED volumes instead,” Kadner notes. “These include Star Trek: Discovery, Star Trek: Strange New Worlds, Netflix’s 1899, Thor: Love and Thunder, and Bullet Train.
“As of this writing, there are over 120 major LED volume studio facilities across the globe, and that number is quickly increasing.”
Kadner has outlined an enlightening set of 10 tips and tricks for newcomers to virtual production on the Frame.io Insider blog. Let’s break them down.
1. Fix It in Pre
As Kadner remarks, “Anyone who’s spent some time on a set or in an edit bay has heard the term ‘fix it in post.’” But large-scale virtual production volumes are technologically complex, and to make them perform at their best, the lion’s share of visual development must occur in pre-production. That’s a reversal of the recent norm, in which issues discovered on set were fixed in post.
On a virtual production, film schedules are front-loaded with more time for pre-production and a less extensive post period.
“Many seasoned filmmakers aren’t accustomed to the idea of making every decision in terms of effects imagery before production occurs and may find the process counterintuitive,” says Kadner.
“Assets such as models, characters, 3D environments, etc., must be completely camera-ready before production starts. Along the way, this also means a lot more iteration and visual development can occur.”
Indeed, the Virtual Art Department, previsualization, and virtual scouting are all vital parts of the LED volume pre-production workflow.
“In many ways,” Kadner writes, “the production day becomes about executing a carefully validated plan instead of the best-guess shots in the dark that non-virtual productions can often seem like.”
2. New Production Roles
Two new additions to the production hierarchy, the Virtual Production Supervisor (VPS) and the Virtual Art Department (VAD), are crucial to ensuring successful virtual productions, so it’s essential to understand how they operate.
“A Virtual Production Supervisor acts as the liaison between the physical production team, the Art Department, the VFX team, and the ‘brain bar’ (ILM’s term for its Volume Control Team),” Kadner explains.
He suggests that the VPS combines the roles of VFX Supervisor and Art Director. The responsibilities of the VPS include overseeing the Virtual Art Department during pre-production, and supervising the LED volume during production.
“The VAD is where all elements that ultimately wind up on the LED walls are designed and created. This area encompasses a traditional art department, with an emphasis on digital assets. The VAD is constantly creating objects which may be digital models, real-world props, or both.”
3. Avoid Looking Like a Video Game
Photorealism is the aim nine times out of ten, but the pitfalls of letting the virtual environment look like a video game are all too real. To avoid having your project look like something out of Fortnite, photogrammetry is the go-to technique: a method of measuring physical objects and environments by analyzing photographic data, from which 3D assets are then constructed.
Kadner name-checks a few useful photogrammetry tools, such as RealityCapture and Epic Games’ free Quixel Megascans library, which is “chock full” of 3D assets, textures, and full-blown environments created with photogrammetric techniques.
What’s great about photogrammetry, Kadner writes, is that it’s relatively simple to use. “The effort needed to create a photorealistic 3D asset from photogrammetry is often far less than making the equivalent from scratch digitally.”
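To make the underlying idea a little more concrete, here is a deliberately tiny two-view structure-from-motion sketch in Python using OpenCV. It is not the RealityCapture or Megascans workflow Kadner describes, just an illustration of the principle those tools industrialize: match features across overlapping photos, recover the relative camera pose, and triangulate a sparse 3D point cloud. The file names, feature counts, and camera intrinsics are placeholder assumptions.

```python
import cv2
import numpy as np

# Two overlapping photos of the object or environment (hypothetical file names).
img1 = cv2.imread("photo_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_02.jpg", cv2.IMREAD_GRAYSCALE)

# Detect and describe features in both images, then match them.
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Assumed pinhole intrinsics: focal length in pixels, principal point at center.
h, w = img1.shape
K = np.array([[2800.0, 0.0, w / 2],
              [0.0, 2800.0, h / 2],
              [0.0, 0.0, 1.0]])

# Recover the relative camera pose from the matched points.
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

# Triangulate the matches into a sparse, up-to-scale 3D point cloud --
# the seed that a real photogrammetry package densifies and meshes into an asset.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T
print(f"Triangulated {len(pts3d)} sparse 3D points from {len(matches)} matches")
```

A production tool runs this process across hundreds of photos, densifies the cloud, and bakes textures; the point here is only that the 3D data comes from measurement of the real world, which is what keeps it from looking like a video game.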
4. Get the Most Powerful System You Can Afford
“The more GPU power in your system, the greater the level of detail in an environment you can have on your LED wall in real time,” says Kadner.
It’s not something you should have to figure out alone: a quality integrator can ensure you have a system that performs well and doesn’t blow its fans nonstop.
But “many of the key components and plugins for virtual production, such as camera tracking and LED panel support, are only available on Windows,” Kadner cautions, and if the volume has multiple surfaces you may need to employ multiple PCs.
Bottom line? “You can never have too much GPU power.”
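If you want a rough sense of how close to the ceiling a given scene is pushing a render node, a quick headroom check can help. The snippet below is a minimal monitoring sketch, assuming an NVIDIA GPU with nvidia-smi on the PATH; it is not part of Unreal Engine or any vendor’s toolchain, just a way to log load and VRAM while you fly a virtual camera through your heaviest test environment.

```python
import subprocess
import time

QUERY = "--query-gpu=name,utilization.gpu,memory.used,memory.total"

def sample_gpu() -> None:
    # nvidia-smi ships with the NVIDIA driver; values come back as plain CSV.
    out = subprocess.run(
        ["nvidia-smi", QUERY, "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    for line in out.splitlines():
        name, util, mem_used, mem_total = [f.strip() for f in line.split(",")]
        headroom = 100 * (1 - int(mem_used) / int(mem_total))
        print(f"{name}: {util}% load, {mem_used}/{mem_total} MiB VRAM "
              f"({headroom:.0f}% headroom)")

if __name__ == "__main__":
    for _ in range(30):      # sample once a second for roughly 30 seconds
        sample_gpu()
        time.sleep(1)
```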
5. Understand Pixel Pitch
Kadner explains that one of the most critical technical attributes of any virtual production volume, whether you’re building your own or working in a rental, is the pixel pitch of the LED panels.
“Put simply, pixel pitch is the distance between individual LED lights on the screen and is measured in millimeters,” he writes.
“Because you’re re-photographing the screen, the pixel pitch directly correlates to how the image looks. If it’s not dense enough, the image can look low resolution. Or even worse, you may see moiré patterns.”
(A moiré pattern is a ghostly interference pattern that occurs when a fine pattern on your subject meshes with the pattern of photosites on your camera’s sensor, creating a third pattern that can appear in your images as odd stripes and other artifacts.)
“The higher the pitch is,” says Kadner, “the more likely moiré will appear when the camera focuses close to or onto the screen.
“For reference, the pixel pitch of the LED panels used on The Mandalorian is 2.8mm. But that screen is also approximately 20 feet tall by 70 feet across, so the camera can be much farther away and is less likely to focus on the screens. If you’re working in a smaller volume, this can become even more of an issue.
“Panels are now available at 1.5mm and even more dense, which can mitigate or eliminate moiré. The tradeoff is that the lower you go, the more expensive the screens become.
“So, there’s ultimately a perfect storm to consider which factors in pixel pitch, camera-to-screen distance, focal length, focus plane, camera sensor size, and content resolution to determine whether your footage shows a moiré pattern or not.”
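Under a simple thin-lens model you can sanity-check those factors before stepping on the stage. The sketch below is not a formula from Kadner or any panel vendor, just an assumed back-of-the-envelope check in Python: it projects the LED pitch onto the sensor, estimates how much defocus blur the wall receives when focus sits on the actor, and flags combinations where the pixel grid is likely to be resolved sharply. The example numbers (lens, stop, sensor) are illustrative only; nothing replaces a camera test on the actual volume.

```python
# Back-of-the-envelope moire check for re-photographing an LED wall.
# Simple thin-lens model; the thresholds are assumed rules of thumb, not specs.

def moire_check(pitch_mm, wall_dist_m, focus_dist_m,
                focal_mm, f_number, sensor_width_mm, horiz_photosites):
    wall_mm = wall_dist_m * 1000.0
    focus_mm = focus_dist_m * 1000.0

    # LED pixel pitch as projected onto the sensor (magnification = f / (D - f)).
    projected_pitch = pitch_mm * focal_mm / (wall_mm - focal_mm)
    photosite = sensor_width_mm / horiz_photosites

    # Defocus blur diameter on the sensor for the wall, with focus on the subject.
    aperture = focal_mm / f_number
    blur = aperture * focal_mm * abs(wall_mm - focus_mm) / (wall_mm * (focus_mm - focal_mm))

    print(f"LED pixel on sensor: {projected_pitch*1000:.1f} um "
          f"(photosite {photosite*1000:.1f} um), defocus blur {blur*1000:.1f} um")

    if blur > 2 * projected_pitch:
        print("Wall is well outside focus: the pixel grid is smeared out, moire unlikely.")
    elif projected_pitch > photosite:
        print("Wall is near focus and its pixel grid is resolvable: high moire risk.")
    else:
        print("Grid sits near the sensor's resolution limit: test on camera before shooting.")

# Example: 2.8mm pitch, wall 7m away, focus on an actor 3.5m away,
# 50mm lens at T2.8 on a large-format sensor (values are illustrative).
moire_check(2.8, 7.0, 3.5, 50.0, 2.8, 36.7, 4448)
```

Pushing the wall farther back, opening up the aperture, or keeping focus well off the screen all increase the blur relative to the projected pitch, which is exactly the practical advice above.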
6. Know How Many Walls You Need
But how do you determine if you need a full LED volume or a single-wall LED stage? Kadner describes “a significant scale continuum from the simplest single-wall, rear-projection LED setup to the massive volumes used on The Mandalorian.
“In general, the larger the volume, the more expensive it will be to rent or to purchase if building from scratch. So, it’s critical to determine how much LED volume you need.”
As he explains, the choice in volume size and configuration “has a huge impact on interactive/emitted light.”
For example, actors and other set pieces placed in front of a single, flat LED wall will end up as dark silhouettes against the screen. But if you also employ LED sidewalls and ceilings and the like, “you will have emissive lighting falling naturally on your subject,” he says.
“But even if you don’t need or can’t afford an enveloping volume, it’s still very possible to create interactive lighting in sync with the screen content” with DMX lighting and pixel mapping techniques, as described in the section below.
7. Learn How to Use Interactive Lighting
Digital Multiplex (DMX) lighting, a system for controlling lights and other stage equipment, offers a significant amount of utility for virtual production, says Kadner.
“You can program specific lighting cues and colors with DMX directly in Unreal Engine or via a lighting board,” he writes. “Or, through pixel mapping, you can set any light on your stage to mimic the color and intensity of a portion of your 3D scene.”
This provides a number of interesting possibilities for virtual production, as Kadner notes. “For example, if you’re doing a driving scene, you could set up pixel mapping to sample a portion of your background plate footage and connect it to lights above and to the sides of your picture vehicle,” he says. “You can mimic passing car headlights, street lamps, tail lights, you name it.”
Kadner recommends DMX-compatible lights that offer full-color control, such as ARRI SkyPanels, ARRI Orbiters, and LiteGear LiteMats. Pixel mapping software is also a must-have. “Unreal Engine has DMX control, so you can control DMX lights directly from within scenes,” he notes. “Some other examples of external pixel mapping applications include Enttec ELM, Resolume, and MadMapper.”
The final element, Kadner says, is a DMX-to-computer interface, which can range anywhere from a simple USB box like the Enttec USB Pro Mk2 to complex Ethernet networks fed by an interface box such as the Enttec ODE Mk2.
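To show what pixel mapping looks like in practice, here is a minimal sketch in Python that samples a region of a background frame, averages it to a single RGB color, and pushes it to a DMX fixture over Art-Net. It is not Unreal Engine’s DMX plugin or any of the applications named above, and the node IP, universe, and three-channel RGB fixture patch are assumptions you would replace with your own rig; in real use you would re-run the sampling loop once per frame of the plate.

```python
import socket
import struct
import numpy as np

def artnet_dmx_packet(universe: int, dmx_data: bytes) -> bytes:
    """Build a minimal ArtDMX packet (Art-Net protocol, opcode 0x5000)."""
    header = b"Art-Net\x00"
    opcode = struct.pack("<H", 0x5000)          # OpDmx, little-endian
    protocol = struct.pack(">H", 14)            # protocol version 14
    seq_phys = bytes([0, 0])                    # sequence disabled, physical 0
    uni = struct.pack("<H", universe)           # universe, little-endian
    length = struct.pack(">H", len(dmx_data))   # data length, big-endian
    return header + opcode + protocol + seq_phys + uni + length + dmx_data

def sample_region_rgb(frame: np.ndarray, y0, y1, x0, x1) -> tuple:
    """Average an RGB frame region down to one color -- crude pixel mapping."""
    region = frame[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    return tuple(int(c) for c in region)

# --- Example usage (all addresses and patch details are hypothetical) ---
# Stand-in for one frame of your background plate (height x width x RGB, 0-255).
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame[:, 1280:, :] = (255, 180, 60)   # pretend passing headlights camera-right

r, g, b = sample_region_rgb(frame, 0, 540, 1280, 1920)   # sample upper right

# A hypothetical 3-channel RGB fixture patched at DMX address 1, universe 0.
dmx = bytearray(512)
dmx[0:3] = bytes([r, g, b])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artnet_dmx_packet(0, bytes(dmx)), ("192.168.1.50", 6454))  # assumed node IP
```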
8. Master Color Measurement
Understanding color science, Kadner says, is integral to the cinematographer’s craft and essential to virtual production, in which one digital device captures, and essentially re-photographs, the output of another digital display.
The light cast by LED screens themselves can have unexpected and undesirable results for the unwary filmmaker, Kadner warns. “Watch out for metamerism, which refers to the visual appearance of an object changing based on the spectrum of light illuminating it. LED panels are designed to be looked at directly, not act as lighting sources.”
However, supplementing the emissive light coming off the LED panels with additional movie lighting can help offset the issue. “It’s more work to set up but the results are worth the effort,” Kadner says. He also notes that manufacturers are beginning to develop LED panels with better full-spectrum color science, which may make the issue entirely moot in the near future.
Other important factors include finding out whether your LED wall processes 8-bit, 10-bit, or 12-bit color; employing a high-quality video processor from a company such as Brompton or Megapixel VR; and determining the color space of your camera LUT. You’ll also want to ensure that you’re using high-quality, color-accurate, large HDR monitors on set to evaluate the final image captured in-camera.
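As a quick illustration of why the wall’s bit depth matters, the snippet below encodes a smooth linear-light gradient with a simple 2.4 gamma (an assumed stand-in for the processor’s real transfer function) and counts how many distinct code values each bit depth leaves in the deep shadows, where banding on the wall tends to show up first. It is not a calibration workflow, just a way to see the headroom that 10-bit and 12-bit pipelines buy you.

```python
import numpy as np

# A smooth linear-light ramp, encoded with an assumed 2.4 gamma
# (real pipelines use the wall processor's actual transfer function).
linear = np.linspace(0.0, 1.0, 200_000)
encoded = linear ** (1 / 2.4)

for bits in (8, 10, 12):
    levels = 2 ** bits - 1
    codes = np.round(encoded * levels).astype(int)
    # Count distinct code values available in the darkest 5% of linear light,
    # roughly the shadow region where banding becomes visible first.
    shadow_codes = np.unique(codes[linear <= 0.05])
    print(f"{bits}-bit: {levels + 1} code values total, "
          f"{shadow_codes.size} of them in the darkest 5% of linear light")
```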
9. Don’t Think About Virtual Production as a Zero-Sum Game
There’s a lot of talk about being able to produce pixel-perfect final shots on-set and eliminating post altogether, but not everything can be captured in-camera, as Kadner explains. And even if the technology does advance to that extent in time, it may not be creatively desirable.
“For example, the percentage of final shots captured in-camera on The Mandalorian was around fifty percent on season one, according to ILM,” he writes. “The finality of shots captured in an LED volume can vary from ‘ready to edit’ to ‘requires some additional enhancements.’ ”
In other words, don’t think of virtual production as a zero-sum game. “Think of it more as a continuum of potential additional finessing in post vs. all or nothing,” he says.
“Most visual effects supervisors who’ve worked in LED volumes agree that it’s far easier to fix visual issues with in-camera effects shots than to start with green screen cinematography and add the entirety of the background imagery in post-production. It’s a reductive and refining process vs. starting with a blank canvas.”
READ MORE: The Mandalorian: a test bed for Virtual Production (ProVideo Coalition)
10. Embrace Change as the Only Constant
The pace of change in virtual production with LED technology, and in related areas such as AI, camera-to-cloud, 5G connectivity, and volumetric photography, inevitably means that as soon as you’ve locked the tech spec down for a project, elements of it will have advanced.
Kadner points to Epic Games’ latest release of Unreal Engine, which is accompanied by a host of tools expressly designed for the virtual production filmmaker.
What was completely impossible or highly difficult to accomplish one day may be standard operating procedure the next. “Each version offers advancements that will make things faster and more realistic in virtual production,” he writes.
“So, to save your time, sanity, and budget, embrace constant change. Attend many webinars, watch a lot of YouTube videos, read all you can, and above all, experiment.”