Currently in its sixth installment, the HBO Camera Assessment Series is a feature-length movie that employs staged scenes that each clearly demonstrate the strengths—and weaknesses—of cameras such as the Sony Venice 2, the RED V-Raptor, the new ARRI Alexa 35 configuration, the Blackmagic Ursa 12K, and many more.
The nature of the gear has, of course, evolved considerably since HBO began the series as a deep dive into digital cinematography, which at the time was only beginning to seriously challenge motion picture film as the most viable medium for the network’s shows.
Cinematographer/director Suny Behar has overseen these assessments from the start. Together with HBO and under the leadership of Stephen Beres, senior vice president of Production Operations at HBO, Max and Warner Bros. Discovery, Behar has created new installments in the series whenever the state of camera technology has advanced enough to warrant it.
“When we started 10 years ago,” recalls Behar, “a lot of the questions weren’t about comparing the performance and the quality of the cameras as much as whether or not some cameras could even perform.
“There was a vast difference between a camera that could record even 10-bit 4:4:4 versus a [Canon] 5D that was 8-bit 4:2:0, so you couldn’t do green screen work; there was significant motion artifacting; and it was difficult to focus. Those larger differences aren’t what we’re looking at now because all the cameras can do at least 10-bit 4:2:2. They all have at least 2K, Super 35-sized sensors.”
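As a rough illustration of how wide that early gap was, the sketch below (plain Python with generic 1080p figures, not anything taken from the CAS) compares how much chroma information and how many code values per channel those two recording formats actually carry.

```python
# Rough comparison of the two recording formats Behar mentions, using generic
# 1080p numbers. The subsampling factors and bit depths are standard; nothing
# here comes from the CAS itself.
def chroma_samples(width, height, scheme):
    """Chroma samples per frame (Cb and Cr planes) for common Y'CbCr schemes."""
    factors = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}
    return int(width * height * factors[scheme]) * 2

full = chroma_samples(1920, 1080, "4:4:4")
dslr = chroma_samples(1920, 1080, "4:2:0")
print(f"4:4:4 chroma samples: {full:,} vs 4:2:0: {dslr:,} ({full // dslr}x fewer for the keyer)")
print(f"code values per channel: 10-bit = {2 ** 10}, 8-bit = {2 ** 8}")
```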
There continue to be differences, some quite significant, among the tested cameras, Behar adds, “but it’s in different realms. The tests are no longer about [finding] where the picture just breaks, but as people expect more, there are other issues we investigate.”
Circumstances that people wouldn’t have tried to shoot a decade ago are now becoming standard expectations for a DP.
“You are going to care about signal to noise if you’re trying to shoot with available light, where some cameras will be significantly noisier than others. In the world of HDR, if you’re shooting fire or bright lights, you are going to care about extended dynamic range in the highlights, if you hope to not have to comp all your highlights in with the effects because [the highlights] broke.”
Stephen Beres explains that these tests, which have screened at various venues, serve as the start of discussions for his networks’ productions, not as any kind of dictate.
“We don’t have a spreadsheet of allowed and disallowed,” Beres explains. “What we have is projects like this, so when we sit down together — the studio and the creative team on the show — and we look at these kinds of things as a group, it can help us start the discussion about the visual language of the show. ‘What visual rules should be set up for that world that that show exists in?’
“And then we sort of back that into the conversation about ‘what technology are we going to use to make that happen?’ And that’s not just about cameras. It’s the lensing. It’s what we do in post, and it’s how we work with color. It’s how we work with texture. All those things go together to create the visual aesthetic of the show.”
Once a new installment of the CAS is complete, the company is delighted to share the results with anyone who is interested. Beres and Behar have both taught production and post at the university level, and they clearly enjoy sharing their knowledge.
The Assessments
A great deal of thought goes into designing these camera tests to ensure apples-to-apples comparisons, with elements such as color grading and the color and gamma transforms all handled identically.
“I think all of the cameras we tested this time shot RAW,” Behar says, “so then you have to make decisions about how you’re going to get to an intermediate [format for grading].”
They decided to use the Academy Color Encoding System (ACES) as the working color space. While some people in the cinematography and post realms still have issues with ACES, Behar says, it has proven useful because it forced every manufacturer to declare an IDT, whether they liked it or not.
The IDT, or Input Device Transform, provides objective numerical data quantifying the exact response of a given sensor so that its images can be transformed accurately into ACES space; the ODT (Output Device Transform) then maps ACES imagery to a specific display standard.
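As a rough sketch of what an IDT does under the hood (this is not HBO’s pipeline; the matrix is a placeholder and the log curve is invented, standing in for a manufacturer’s published values), the idea is simply to decode the camera’s log encoding back to scene-linear light and then apply the declared 3x3 matrix into ACES AP0 primaries:

```python
import numpy as np

# Hypothetical log-to-linear decode. Real cameras ship their own curves; this
# generic log2 curve is only a stand-in for illustration.
def hypothetical_log_to_linear(code, mid_gray_code=0.41, mid_gray_linear=0.18, stops_per_unit=10.0):
    """Map a normalized log code value back to scene-linear light."""
    return mid_gray_linear * 2.0 ** ((np.asarray(code, dtype=float) - mid_gray_code) * stops_per_unit)

# Placeholder 3x3 matrix from camera-native primaries to ACES AP0 (ACES2065-1).
# A real IDT supplies the measured matrix for that specific sensor.
CAMERA_NATIVE_TO_AP0 = np.array([
    [0.69, 0.21, 0.10],
    [0.09, 0.82, 0.09],
    [0.02, 0.11, 0.87],
])

def idt(log_rgb):
    """Sketch of an IDT: decode log to scene-linear, then matrix into AP0."""
    linear = hypothetical_log_to_linear(log_rgb)
    return linear @ CAMERA_NATIVE_TO_AP0.T

print(idt([0.41, 0.41, 0.41]))  # a neutral middle-gray patch lands at ~0.18 in ACES
```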
While some manufacturers were reluctant to subject their sensors to such scrutiny (after-the-fact tricks with contrast, saturation and the like can no longer hide a sensor’s flaws), all eventually came around because of the growing adoption of ACES and its backing by the Academy of Motion Picture Arts and Sciences and the American Society of Cinematographers.
Because of this, the ACES imagery upstream of any color grading really does provide a look into a sensor’s dynamic range, color and detail rendering.
The CAS team then applied the same across-the-board grade (no secondaries, no Power Windows) and transform to every tested camera to deliver final Rec. 709 images, which reveal many of the different sensors’ attributes and liabilities. Next, to evaluate HDR, they derived a PQ curve from the same picture information and opened it up without any further adjustments.
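For reference, the PQ curve is defined by SMPTE ST 2084, whose inverse EOTF maps absolute luminance up to 10,000 nits into a normalized signal. The short sketch below shows that standard encoding; it illustrates the curve itself, not the CAS team’s specific derivation:

```python
# Constants from SMPTE ST 2084, the PQ transfer function used for HDR delivery.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF: absolute luminance in cd/m^2 -> normalized PQ signal [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

# Deep shadow, SDR reference white, a bright HDR highlight, and the PQ peak.
for level in (0.01, 100, 1000, 10000):
    print(f"{level:>8} cd/m^2 -> PQ code {pq_encode(level):.3f}")
```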
“The only test that we did not put through that exact pipeline,” says the cinematographer, “was the dynamic range test. I’ve always felt that the ACES-to-Rec. 709 transform is too contrasty, meaning it has a very steep curve and a very high gamma point, which tends to crush blacks and push up mids. It does give you a punchy image, but if we’re testing dynamic range, and especially in low light, the first questions the viewer would have are ‘is there more information in the blacks?’ or ‘how did you decide what to crush?’ and those are very valid points.”
For this, Behar shot a very large number of test charts, which gives the team the ability to map its own gamma transform. Shooting in log at key exposure and at many steps over and under, the team can lock in an across-the-board standard for middle gray based on each camera system’s log profile. Once each camera is set up for perfectly exposed middle gray, the over- and under-exposure tests can be objectively compared.
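A minimal sketch of that comparison, using a hypothetical log profile rather than any real camera’s published curve: once you know the code value a profile assigns to middle gray and how many stops it spans, a measured chart patch can be converted to stops over or under key exposure, giving every camera a common scale.

```python
# Hypothetical log profile, for illustration only. Each camera system publishes
# its own curve (LogC, S-Log3, etc.); these constants stand in for those values.
MID_GRAY_CODE = 0.41        # code value this profile assigns to an 18% gray chart
STOPS_PER_CODE_UNIT = 10.0  # stops of scene light spanned per unit of code value

def stops_from_mid_gray(code_value):
    """How far over (+) or under (-) key exposure a gray-chart patch landed."""
    return (code_value - MID_GRAY_CODE) * STOPS_PER_CODE_UNIT

# Made-up code values from a chart sweep of the same gray patch.
for code in (0.21, 0.31, 0.41, 0.51, 0.61):
    print(f"code {code:.2f} -> {stops_from_mid_gray(code):+.1f} stops from middle gray")
```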
Given that a number of the cameras tested reached approximately 18 stops of dynamic range, I asked whether such a capability is overkill. Circumstances in which a cinematographer would actually use that much dynamic range are few and far between. More likely, they’ll want to use lighting and grip gear to limit such situations, as they always have.
“That’s right,” says Behar. “I think most DPs won’t need more than 12, maybe 13, stops of dynamic range to tell a story. You can’t hide a stinger in the shadows if you’re seeing 10 stops under. You can’t have a showcard in the window if you’re seeing 12 stops over.
“But then it stands to reason that the camera manufacturers should allow us to use that information to create soft knee rolloffs and toe rolloffs for lower dynamic range, but with beautiful rolloffs into the highlights and the shadows.
“You can’t create a look [digitally] that is like Ektachrome, with maybe four stops over and three and a half under, if you’re clipping at four stops. You need to burn and roll and bleed and have halation. With the dynamic range on some of these cameras we’ve tested, you can do more than just light for an 18-stop range.”
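As a toy illustration of the soft-knee rolloff Behar describes (all constants are invented, not drawn from any camera or from the CAS), the function below stays linear up to a knee point and then eases highlights toward the top of the display range instead of hard-clipping them:

```python
import math

# Toy soft-knee highlight rolloff: linear below the knee, an exponential
# shoulder above it that eases values toward the top of the range instead of
# hard-clipping. A shadow "toe" works the same way in mirror image.
def soft_knee(x, knee=0.8, shoulder=1.5):
    """Map scene-linear values (1.0 = nominal white) into roughly [0, 1)."""
    if x <= knee:
        return x
    return knee + (1.0 - knee) * (1.0 - math.exp(-(x - knee) / shoulder))

# A hard clip throws away everything above 1.0; the soft knee keeps separation
# between, say, a bright window at 2x white and a practical lamp at 6x white.
for level in (0.5, 1.0, 2.0, 6.0):
    print(f"linear {level:4.1f} -> hard clip {min(level, 1.0):.2f}, soft knee {soft_knee(level):.2f}")
```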
Behar and Beres both take great pride in these CAS films, which are shot and produced to feel like high-quality HBO-type programming, not just charts and models sitting in front of charts.
“This is real scenes with moving cameras, moving actors,” says Behar, promising that the cinematography and production are of the highest caliber. “The number one feedback response we’ve gotten so far has been, ‘Holy crap! I thought this was going to be a camera test!’”
If you want to see the HBO Camera Assessment Series, register for NAB Show New York, sign up to attend the screening and bring your badge to the free off-site event on Wednesday, October 25. Only NAB Show New York attendees are eligible for the viewing. Find full details and how to register here.
NAB Show New York will then host a hands-on follow-up, The Making of the HBO Camera Assessment Series (CAS) seminar, on Thursday, October 26, at 11 a.m. on the show floor at the Javits Center, featuring CAS leads Stephen Beres and Suny Behar. They will explain the testing methodology and dive into the technology advancements that have changed the style and type of analysis required.
Additionally, post-discussion sessions and demos will feature a Sony Venice 2, ARRI Alexa 35, Panasonic AK-PLV 100 and a pair of Sony FR7 camera packages, along with gear from Mark Roberts Motion Control and Vinten, and supporting sponsors Fujinon, LUX, Multidyne, and Seagate.