From “Fathead,” produced by Tom Thudiyanplackal
TL;DR
- AI for production should be thought of and used as an assistant, says technologist and producer Tom Thudiyanplackal, because current generative AI tools deliver only approximations.
- There are dangers in mixing the iterative nature of AI with the non-iterative nature of an actual shoot day on a virtual production set.
- Studios are building their own AI models trained on data they own to speed production and safeguard against IP challenges.
“While tools like ChatGPT and Midjourney are being made free to us today, soon, they won’t be free,” warned technologist and producer Tom Thudiyanplackal. “So, we don’t want to be too addicted to them or too dependent on them.”
Thudiyanplackal produced the experimental virtual production short film Fathead for ETC (Entertainment Technology Center). He believes the intelligence that we’re transferring to the AI is really human intelligence.
READ MORE: “Fathead:” Virtual Production (Almost) Completely in the Cloud (NAB Amplify)
“We should never forget that. It is always machine learning, it’s learning from us. So we have to retain our intelligence, we have to retain what we wish to achieve with certain tools [and retain] goals for our creative outcomes.”
In a presentation for the vETC virtual conference at NAB Show, Thudiyanplackal shows how it is possible to script a scene and create a character using text prompts. In doing so, he highlights AI’s limitations and the need for humans to create something worth using. Watch the full presentation in the video at the top of the page.
“It’s able to do some of these things pretty well [but] there’s still a certain degree of accuracy that’s missing. These are the places where you cannot wholly depend on it. You still need to have your basic knowledge in whatever area that you’re wanting to use an AI for.
“So, you need to know how good screenplays are built, what dramatic structures [look like], what good plots are, how to build good information that makes a character interesting to the audience. There are no shortcuts for those things, so you do need to do the legwork.”
Human talent and knowledge of story are needed to decide on and build into an AI all sorts of nuances, including exactly what scenario the drama plays out in. What’s the conflict, and how does it unfold in that character’s journey?
“Those are things that you’re weaving into the story and not something that’s going to come from the AI software. These are things that you do need to go and learn from masters, observe their work, watch a lot of movies, go work with somebody as an assistant.”
Where Thudiyanplackal believes AI currently scores is in ideation (such as design ideas for marketing posters at the pre-production stage) and in giving everyone the tools to draw, storyboard or create 3D models. Even here, though, he advises users to be wary of certain anomalies that will occur once in a while.
“If there’s a cow standing on one side of a building, the front of it sticking over here, and there’s another cow standing on the other side of the building, with just the back of it sticking out, the AI could just [count] that as one cow — one very long cow. Those errors have to be tracked by a human. With all of these AI tools it starts with approximations and then you have to keep massaging it to get to a certain level.”
This matters a lot if you are building scenes and characters for eventual use in a virtual production set-up with live action. It is becoming increasingly clear to all who use virtual production that success with this method is all about pre-planning.
You cannot arrive on a virtual production set with an approximation of what you are going to shoot — it needs to be exactly what you are going to shoot.
“A lot of people heard the early advertising around virtual production and got very excited believing that at the flip of a button, we could be in Rome in the morning, and then Paris doing a night sequence. The issue is that directors would show up on the day and request different variations, which [weren’t] discussed prior… and they would get disappointed because those changes are not possible,” he said.
“You couldn’t request for something on the wall that we hadn’t already planned, built, tested, and pre-written.”
In other words, even with AI, we are not yet at the stage of real-time production on a virtual set (although there are tools such as Seyhan Lee’s Cuebric that purport to do just that).
Thudiyanplackal also tackled the questions of copyright and ethics and the legitimacy of algorithms training on human artists’ data. He says studios have been very careful to date about using AI because they don’t want to take the risk of breaking the law.
“So, when they want to experiment with AI in their pipeline, they’re being very specific,” he says. “Rather than using this very general tool, which applies to an entire project or an IP, they’re more interested in something very specific where an AI generates quick 3D meshes to build 3D environments, for example.”
He cites the case of AI being used to de-age Harrison Ford in Indiana Jones and the Dial of Destiny. “That’s a very specific thing that they’ve done. They use their own training data, because Lucasfilm had done so much work with them. They had all that training data from that age that [they] needed. It all comes down to that baseline training data, to protect IP and to be safe.”
He says that studios plan on building their own AI models using proprietary training data. “That[’s] why [studio] adoption is slow, but those are things that are in discussion.”