Dimension Studio bridges creativity and on-set systems to help clients produce content using virtual production workflows. Virtual production is making important changes to the way film, episodic and live broadcast content is captured and created. Since October 2020, cinematographer Jim Geduldick has been Director of Virtual Production and Senior Vice President at Dimension Studio North America, a relatively new location for the UK-based company.
Also in 2020, Jim helped Dimension and DNEG collaborate on their ‘Gunslinger’ virtual production and LED volume test. Now, under Jim’s direction, Dimension North America is rebuilding traditional production pipelines with new, real-time techniques including volumetric capture powered by game engines.
From classic rear and front projection to the use of trans lights for interactive lighting, virtual production has in fact been used across the media and entertainment industry for decades. The rapid evolution and greater accessibility of these techniques in recent years have led to wider adoption of real-time workflows. Game engines like Unreal, Unity and CryEngine have advanced, especially in terms of their capabilities for use in real-time workflows, further supported by upgraded GPUs and more sophisticated video I/O equipment.
“For creatives and technical teams, we now have robust tools that allow us to be more proactive with workflows, blending between previs and the virtual art department,” Jim said. “The speed and control with which we can iterate with real-time tools, going from pixel to hardware and out to an LED or traditional green screen, has rapidly evolved. If you look at the past 18 months alone, virtual production has generated a lot of excitement in terms of its potential, both for filmmakers now and into the future.”
As a virtual production supervisor, Jim’s work blends the traditional roles of VFX Supervisor and DP, while acting as a bridge between different departments to make sure that digital and physical set elements mesh together in real-time during live productions. He leads Dimension’s virtual production team and a core group of engineers and creatives who collaborate on-set to create and send images onto LED walls surrounding the stage.
Hands-on with Hardware
On set, Jim's team pairs a real-time game engine with the AJA KONA 5 PCIe video I/O card, which delivers enough performance to accurately combine live elements and CG assets with virtual environments in real time. He said, “The essence of real-time workflows is that we’re not waiting anymore while our images render, like we used to. Instead, while we’re still on the set, we’re pushing as much of the process as possible into the pipeline through AJA hardware and other gear to make everything quick and interactive upfront.
“While it’s up to us to design the workflow, the KONA 5 capture cards are our workhorses that send high-bandwidth, raytraced assets to screens. This ultimately allows us to make on-the-fly creative changes during production, which saves clients time and money because we’re capturing final effects in-camera, rather than waiting until post.”
For Dimension’s clients, adopting real-time virtual production workflows opens up many options and advantages. Making critical decisions earlier in production requires prudent planning and previs, but results in cost savings. Jim finds that virtual production gives him a lot of freedom in his workflows. He said, “To meet continuity and budgetary demands, shooting in an LED volume means I can capture the same time of day, with the same colour hue, at any given time, and meanwhile the actors and crew members can focus more on the performance.
“It makes us more flexible when we are asked to accommodate scene changes and versions. For example, if we need a pickup or if the screenwriter or director adds new lines, we can re-load the environment, move the set pieces back into place and shoot the scene. No permits or challenging reshoots are required which, again, gives teams more time to spend on the creative aspects of production, rather than the logistics.”
A part of Jim’s role is to safeguard productions in case of accidents, power outages or downtime. “Depending on the workflow that we’re using, I might have the game engine record certain graphics or metadata coming in from sources. For example, if you have background and foreground elements – CG and live action – via a Simulcam workflow, being able to record that composited image as a safety for editorial dailies and VFX is very helpful,” he said.
“I’ve also found that being able to ingest composites back into our systems through AJA I/O hardware like KONA 5 leaves us options to send a signal downstream if we need it, as a safety feature. This safety allows directors, VFX supervisors, and anyone on set to request and view playback through our system. It’s not a requirement, but since it takes a village to create a TV show, indie film, documentary or even a live broadcast, you never know when you may be called on at a very integral time.”
Driving the Narrative
As these systems become more accessible and continue to advance, Jim is optimistic that their potential will expand for the wider industry. Currently, virtual production is helping producers improve on what they do already. However, he anticipates that the next evolution of virtual production tools will help innovate the way narratives are driven and help tell stories in more impactful ways.
“The north star is the continual merging of imagination and possibilities and, fortunately, so many resources are available to help users learn how to use real-time systems. There’s never been a richer time for self-education, so I encourage anyone interested in virtual production to dive in and devote a bit of time and effort to it.” www.dimensionstudio.co