TL;DR
- LED cinematographer Erik “Wolfie” Wolford presents an in-depth demonstration of virtual production using LED walls at the Entertainment Technology Center’s vETC conference.
- Wolford’s demo perfectly matches real and virtual lighting to create a realistic illusion of an actress standing on a sunlit beach, all controlled in real time through Unreal Engine.
- His technical setup includes an HP Z6 G5 desktop workstation equipped with an Intel Xeon (Sapphire Rapids) processor and a pair of high-end NVIDIA RTX 6000 graphics cards running Unreal Engine.
- Kino Flo’s new Mimik lights, designed to create full-spectrum foreground lighting for virtual sets and overcome the shortcomings of LED wall light on skin, automatically adapt to scene changes inside Unreal Engine.
One of the hottest topics of conversation in the filmmaking community of late has been the use of LED walls, or volumes, in production.
But few people have explained it at the level of detail that LED cinematographer Erik “Wolfie” Wolford did this past June, when he shared the real nuts and bolts of how he handles virtual production. His talk, “LED Stage Architecture: How It’s Built,” was presented during a session of the Entertainment Technology Center’s virtual conference, vETC.
Wolford, who’s shot music videos and documentaries, recounts his career path starting at the bottom rung of the production ladder on music videos for such visionaries as Spike Jonze and Michel Gondry. He eventually moved into lighting for special effects, particularly for green screen work, on features, commercials and music videos before focusing his creative energies on virtual production in his current role of LED cinematographer.
His first job involving Unreal Engine was the 2022 animated short Mr. Spam Gets a New Hat from director William Joyce and international VFX company DNEG. The 2D animators working on the project, he recalls, had to adapt to the challenges of 3D in the virtual world. “I came in and moved lights around, changed the size of the lights to make them wrap better, added a lot of shadows, and soon I was hooked!”
As Unreal Engine started to be used more for virtual production or ICVFX (in-camera visual effects), he was all over it.
To illustrate his talk, Wolford created a setup with an actress positioned in front of a digital wall. The real lighting on the actress and surrounding stage was matched with a virtual set of a beach, with the game engine controlling the interaction of foreground and background in real time. So convincing was the setup that the actress appeared to be standing on a sunlit beach, complete with a cliff and the ocean in the background. The illusion’s success lay in the seamless lighting — every element, real and virtual, seemed to be lit from the same source.
He breaks down the setup for the attendees. Unreal Engine is running on an HP Z6 G5 desktop workstation with an Intel Xeon (Sapphire Rapids) processor and a pair of high-end NVIDIA RTX 6000 graphics cards. One machine runs Unreal Engine, doing all the 3D movement based on the position of the camera, “just like first-person shooter games,” he explains, adding that changes in the positioning, depth of field and focal length of his real camera are immediately translated to the 3D background.
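To make that camera-driven loop concrete, here is a minimal Python sketch (not Unreal Engine’s actual API; the class and field names are illustrative) of the idea: each frame, the tracked pose and lens settings of the physical camera are copied onto a virtual camera, and the background is rendered from that virtual camera’s point of view.

```python
from dataclasses import dataclass

@dataclass
class CameraState:
    position: tuple         # (x, y, z) in metres, stage coordinates
    rotation: tuple         # (pan, tilt, roll) in degrees
    focal_length_mm: float
    focus_distance_m: float
    aperture_f_stop: float

def update_virtual_camera(tracked: CameraState) -> CameraState:
    """Mirror the physical camera's pose and lens settings onto the virtual one."""
    return CameraState(
        position=tracked.position,
        rotation=tracked.rotation,
        focal_length_mm=tracked.focal_length_mm,
        focus_distance_m=tracked.focus_distance_m,
        aperture_f_stop=tracked.aperture_f_stop,
    )

# One frame of the loop: tracking and lens data in, virtual camera out,
# background rendered from the virtual camera's point of view.
physical = CameraState((1.2, 0.0, 1.6), (10.0, -2.0, 0.0), 35.0, 3.5, 2.8)
virtual = update_virtual_camera(physical)
print(virtual)
```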
The first box then feeds its signal into a second box, which sends it out through an ATEM switcher to yet another box that serves as the editor node. This third box feeds the signal into a Brompton Technology processor, which takes the background image and breaks it into many squares, delivering each square to an individual one-by-one panel.
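As a rough illustration of that tiling step (this is not Brompton’s software; the panel and frame sizes below are assumed example values), the processor’s job amounts to slicing each background frame into panel-sized squares and routing one square to each panel:

```python
import numpy as np

PANEL_PX = 176  # assumed pixels per panel edge (illustrative value)

def slice_into_panels(frame: np.ndarray, panel_px: int = PANEL_PX):
    """Yield (row, col, tile) for every panel-sized square of the frame."""
    rows, cols = frame.shape[0] // panel_px, frame.shape[1] // panel_px
    for r in range(rows):
        for c in range(cols):
            tile = frame[r*panel_px:(r+1)*panel_px, c*panel_px:(c+1)*panel_px]
            yield r, c, tile

# Example: a large background frame split into per-panel tiles.
frame = np.zeros((1760, 3520, 3), dtype=np.uint8)
print(sum(1 for _ in slice_into_panels(frame)), "tiles")  # 10 x 20 = 200 panels
```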
The signal then goes back to the record decks and also feeds a Megapixel VR Helios LED Processing Platform, which drives special new lights that Kino Flo lent for this demo, called Mimik, designed specifically to allow users to create full-spectrum foreground lighting for virtual sets. The Mimik lights use the same technology that the LED wall uses, but purely to create interactive lighting on the set.
“When we work with LED walls,” he says, “they look great to the eye. They look great to the camera. But they don’t create light that’s great for skin reflections.” To maintain the illusion that the set wraps around, you want the same color space as the set, so the Mimik light delivers a full-CRI (color rendering index) image, which is essential when using LED panels as a light source. “The reds look really red; it looks pretty good on skin. I can simulate more walls giving me the color I want, and it helps tie the actor into the magic of the scene.”
Wolford also likes to use a lot of lights from Aputure, which have an excellent CRI, are efficient and portable, and are easily managed with a smartphone, but, he says, “it takes me about 10 minutes to tune in a color. Then, if I change the scene from daytime to nighttime, I have to go change all my lighting.” Instead, he explains, “the Kino Flo Mimiks let me put the Unreal scene right into the Mimik light. Then the Mimik light just automatically updates as I change [the characteristics of the] scene. If I go from day mode into night mode, it will go into night mode. If the scene [on the LED wall] is a sunset, it will automatically generate sunset light.
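A rough sketch of that “the scene drives the light” idea follows (this is a hypothetical illustration in Python, not Kino Flo’s or Unreal’s actual pipeline): sample the region of the virtual scene nearest the actor and push its average colour to a fixture, so a change from day to night in the scene changes the foreground light automatically.

```python
import numpy as np

def scene_to_fixture_rgb(scene_frame: np.ndarray, region: tuple) -> tuple:
    """Average the pixels the actor would 'see' and return an 8-bit RGB level."""
    patch = scene_frame[region]
    r, g, b = patch.reshape(-1, 3).mean(axis=0)
    return int(r), int(g), int(b)

# Sunset-ish background region -> warm colour level for the foreground fixture.
frame = np.full((1080, 1920, 3), (230, 120, 60), dtype=np.uint8)  # stand-in frame
rgb = scene_to_fixture_rgb(frame, (slice(400, 700), slice(800, 1200)))
print("send to fixture:", rgb)
```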
“These Mimik lights,” he says, “are half LED wall and half actual proper film set light.”
He elaborates on the specific scene he’s set up for the demo: “We’re doing a little campfire beach scene, so we have a virtual light on the screen behind doing a virtual flicker of fire on the wall.” Then he places a physical Mimik light on the actress, which helps sell the illusion that this is all a real scene on a beach.
He demonstrates the real camera and the virtual camera running in Unreal, which captures the motion of the real camera as it trucks left or right or dollies closer to the subject. The scene on the LED wall compensates with all the appropriate parallax applied, again, as he points out, “just the way a first-person shooter game works.”
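The parallax he describes reduces to a small piece of geometry, sketched here with illustrative numbers only: when the camera trucks sideways, a virtual object’s image on the wall shifts by an amount that depends on how far “behind” the wall the object sits.

```python
def on_wall_shift_m(camera_truck_m: float, wall_distance_m: float,
                    object_depth_m: float) -> float:
    """How far a virtual object's image moves along the wall when the camera trucks sideways."""
    # Objects at wall depth are 'painted on' and do not move; very distant ones
    # move almost as much as the camera, so they appear to stay put in the camera's view.
    return camera_truck_m * (1 - wall_distance_m / object_depth_m)

# Camera trucks 1 m with the wall 4 m away: a cliff 40 m deep shifts 0.9 m on the
# wall, while something right at the wall plane does not shift at all.
print(on_wall_shift_m(1.0, 4.0, 40.0), on_wall_shift_m(1.0, 4.0, 4.0))
```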
Digging deeper into the technology, Wolford notes that the virtual camera’s positioning is assisted by a Mo-Sys box using its IR reader, which “observes” little stickers on the real camera by sending out an IR beam and using the IR reflection to determine the relationship between the box and the real camera, translating that into positioning data for the virtual scene.
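Put another way, the tracker measures where the camera sits relative to itself, and that measurement gets chained with the tracker’s own known position to place the camera in the scene. A simplified sketch with assumed 4x4 homogeneous transforms (not Mo-Sys code):

```python
import numpy as np

def pose(tx: float, ty: float, tz: float, yaw_deg: float = 0.0) -> np.ndarray:
    """Build a 4x4 transform from a translation and a yaw rotation."""
    y = np.radians(yaw_deg)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]]
    m[:3, 3] = [tx, ty, tz]
    return m

stage_from_tracker = pose(2.0, 0.0, 1.5)                # where the tracking box sits on stage
tracker_from_camera = pose(0.0, 1.0, 0.1, yaw_deg=15)   # what the IR reflections measure

# Chain the two transforms to get the camera's pose in stage (and hence virtual) space.
stage_from_camera = stage_from_tracker @ tracker_from_camera
print(stage_from_camera[:3, 3])  # camera position handed to the virtual scene
```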
Lens data, such as zoom focal length and aperture, is captured either by a similar device on the camera’s lens, which uses a gear apparatus to track the position of the lens barrel and sends that data into Unreal Engine, or, in the case of some newer lenses, directly by the lens mechanism itself. Either way, zooming in or out or opening or closing the iris is translated into data, fed into Unreal Engine, and the image on the LED wall is adjusted accordingly.
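For the external-encoder route, the raw gear readings mean little on their own; one common approach, sketched below with made-up calibration numbers, is a per-lens table that maps encoder counts to real focal lengths before the values reach the engine.

```python
import numpy as np

ZOOM_CAL_COUNTS = [0, 1000, 2000, 3000, 4000]      # encoder positions (hypothetical)
ZOOM_CAL_MM     = [24.0, 35.0, 50.0, 85.0, 105.0]  # measured focal lengths at each

def encoder_to_focal_length(counts: float) -> float:
    """Interpolate a raw zoom-encoder reading into millimetres of focal length."""
    return float(np.interp(counts, ZOOM_CAL_COUNTS, ZOOM_CAL_MM))

print(encoder_to_focal_length(2500))  # ~67.5 mm fed to the virtual camera
```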
At this point he explains an important word in his field — the frustum, which refers to the area of the LED wall that the real camera is seeing at any given moment. “It’s really computationally heavy to generate all the data for the entire wall,” he elaborates. “Any way we can save on computational power, we do, and one important way is by only generating data on the screen when and where we need to see it.”
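The geometry behind that saving is easy to sketch (illustrative only, not Unreal’s actual implementation): from the camera’s field of view and its distance to the wall, you can work out how wide a strip of the wall actually needs full-quality rendering.

```python
import math

def visible_wall_span(camera_to_wall_m: float, focal_length_mm: float,
                      sensor_width_mm: float = 36.0,
                      camera_center_on_wall_m: float = 0.0) -> tuple:
    """Return the left/right extent (metres along the wall) inside the camera's view."""
    half_fov = math.atan((sensor_width_mm / 2) / focal_length_mm)
    half_span = camera_to_wall_m * math.tan(half_fov)
    return camera_center_on_wall_m - half_span, camera_center_on_wall_m + half_span

# A 35 mm lens 4 m from the wall only "needs" roughly a 4.1 m wide strip rendered.
print(visible_wall_span(4.0, 35.0))
```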
Explaining his methods for lighting performers, sets and props in front of the LED wall, Wolford says he relies on a lot of techniques he learned lighting foreground elements on a greenscreen stage. Interactive lighting that sells the effect and makes the live-action elements seem to share a common space with the background is an equally vital aspect of both types of VFX cinematography.
“If I’m lighting a greenscreen [stage] for a J. J. Abrams movie,” he says, “they’ll send me a picture of the effects background” — a style guide, he says — “and I’ll be like, ‘OK, they’ve got sun coming from the left, there’s a big fireball that’s going to happen on the right and it’s more a reddish fireball than an orange one… so I’ll put a key light from the left and a reddish effect light on the right to simulate the fireball.’”
“Now I just look at the LED wall,” he says, “and I have my style guide right there.”
At the conclusion of his presentation, Wolford took a series of questions. His answers revealed additional tidbits, including the fact that the wall the audience was watching used a 1.9 mm pixel pitch (the distance from the center of one pixel to the center of an adjacent pixel is 1.9 mm; the smaller the number, the higher the resolution of the image). To contextualize, he explains, “You go to Coachella and see a big video wall. That’s going to be a 3.9 pitch. This is 1.9 pitch, so these pixels are very tightly packed together.”
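The arithmetic behind that comparison is straightforward; with an assumed wall size (the dimensions below are example values), pitch translates directly into resolution.

```python
def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple:
    """Pixels across and down a wall of the given size at the given pixel pitch."""
    return int(width_m * 1000 / pitch_mm), int(height_m * 1000 / pitch_mm)

# A hypothetical 6 m x 3 m wall: roughly 3157 x 1578 px at a 1.9 mm pitch,
# but only about 1538 x 769 px at a concert-style 3.9 mm pitch.
print(wall_resolution(6.0, 3.0, 1.9), wall_resolution(6.0, 3.0, 3.9))
```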
Asked about Unreal Engine’s primary competitor, Unity, he said that Unity’s graphics engines have dominated the cellphone game market and that the company made a significant move into high-end VFX when it purchased the highly respected New Zealand-based VFX company Weta Digital (the lead VFX house for the Lord of the Rings films). But he stresses that Unreal Engine and its parent company, Epic Games, “have been very aggressive in the video wall space, through training and teaching,” so while he’s aware that some LED walls based on Unity technology exist, he says they are rare and he’s never encountered one.