Watch “Decoding Broadcast: A Call to Filmmakers and Cinematographers Bridging Worlds” at NAB Show New York 2023.
TL;DR
- Elements of broadcast and feature film production are fusing, as evidenced by the introduction of cine-style cameras with shallow depth of field into live sports.
- Cinematographers are beginning to work within live broadcast, although there are still huge differences in workflow, pacing and language that they need to get to grips with.
- It is thought that more cinema camera companies will open up their systems to allow for tighter broadcast integration.
The conventional wisdom is that live broadcast and cinema production will never meet, but the tools and the craft skills are beginning to blur. While there remain key cultural, equipment and workflow differences it seems as if there’s greater convergence ahead.
Mike Nichols, CEO at Surella, defines the core difference in terms of logistics and aesthetics.
“If ever there was an illustration of left brain, right brain, it’s that the broadcast execution looks at how we do it from the technology side, whereas in filmmaking, the tools are there to support a creative vision,” he said in the session “Decoding Broadcast: A Call to Filmmakers and Cinematographers Bridging Worlds” at NAB Show New York.
“When you’re executing broadcast, it’s really about the nuts and bolts: Does this signal get to this truck? It’s not as if the broadcast mentality doesn’t care about image quality, but it’s not really the primary driving force behind the execution. Whereas in film, you’re approaching the job to look beautiful, cinematic, aesthetically pleasing.”
Nichols is a 20-year veteran of the production and production resource industry, including a dozen years at AbelCine, where he helped grow the company in the large format multicamera space.
This year he launched Surella, a production company with strategic partnerships in the live multi-cam and immersive market.
His contention is that elements of broadcast and feature film are fusing. Perhaps there is no greater evidence of this than the introduction of cine-style depth of field cameras into live sports. He also cites recent projects that have crossed his desk in which cinematic-style look-up tables (LUTs) were being applied to multi-camera studio setups.
There remain huge differences, however, not least if you are a cinematographer or a broadcast TD looking to cross over into the other’s world.
“In the broadcast world, you just don’t have time for the nuances of color space. You have to plot out everything well in advance in a way that is very different than storyboarding a film with your director. It’s a completely different pace. And the language that’s used is pretty jarringly different.”
He reports that more and more cinema DPs are being brought in to do live, multi-camera broadcast-style shoots, but they find it a culture shock if they’ve never been in that environment.
“The good thing is some of the tech producers, the engineers that work at the high level understand both worlds and do a really good job of holding the hands of those cinema DPs because if not it could be very jarring,” he says.
“[In live broadcast] you’re under such immense pressure to nail it, you don’t have a second take.”
He advises DPs with no experience of broadcast but keen to get involved to listen to the communications/talkback between camera ops, technical directors, vision mixers, replay ops and the director calling a show.
“If you really want to understand the live environment, just listen to the comms and hear how the communication happens during a live show. It’ll blow your mind. The main focus is collaboration, because if you don’t do that, you’re going to be spinning off in different directions and not in sync.”
Nichols also talks up the skills of the camera operator working with cine gear like Sony FX cameras in a live environment. It’s not easy to get the focus right in a split second.
“You’re pulling focus but you’re not on a 30-inch lens where you can kind of snap right to it. You’ve got to find it and it’s changing the way operators are approaching their work. It’s changing the way directors are calling shows because they know they have to give an extra beat when they cue up the next camera in their preview in the multi-cam. They know to give their operator that split second extra to find that focus. Because as beautiful as the cinema lenses are, the shallow depth of field is a lot more challenging for the operators.”
All in all, he thinks it’s easier to integrate the cinema workflow into the broadcast environment than it is to bring a broadcast environment into a cine workflow, where DPs have to bring in more crew: a Steadicam op, a focus puller, plus additional camera assistants.
On a recent Surella job, Nichols reports shooting a concert at Radio City Music Hall using the same cameras, workflow and team for the concert film as for the giant-screen IMAG feed.
“It was great because it gave us this freedom to do this thing that really isn’t typically done, where we were producing the IMAG but we were also cutting a concert film. And people really responded well to seeing what looked like a concert film on the big screens, as opposed to every shot being just of the talent. It really worked. I wouldn’t be surprised if we start to see more IMAG and concert film merging.”
Camera and video systems developers like Blackmagic Design are already targeting crossover kit for this space. RED Digital Cinema has also recently developed cameras and systems aimed at live production (notably live VR).
“I think there’s going to be more of a call to action for the companies that consider themselves more street level in terms of the market to be more active in this space. Blackmagic’s entire ecosystem is great. RED is sort of seeing the value in creating their own next tier ecosystem of multi-cam. But I think more and more the cinema companies are going to see that they need to open up their systems to allow for tighter broadcast integration.”