Cooke Optics has partnered with EZtrack to develop a coherent, integrated system that moves the process of conceiving and making movies much closer to the set, and much closer to real time. Cooke's research and development into accurately mapping its lenses, combined with EZtrack's camera tracking and data aggregation hub, has helped to blur the line between production and post-production.
It allows filmmakers to merge images from on-set cameras with high-quality CG elements in Near Real Time (NRT), giving directors the confidence that a shot will work, not in days or hours, but in a few moments.
An integral aspect of the project is Cooke's /i Technology maps, which allow precise, accurate manipulation of the digital image. The mapping information is used to quickly correct the lens's distortion of the image. CG elements are composited into the undistorted image, and the complete image is then re-distorted, so that the CG elements match the way the lens would see them if they were really on set.
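The undistort, composite, re-distort round trip can be sketched with a simple radial-distortion model. The coefficients and the NumPy-only remap below are illustrative assumptions, not Cooke's actual /i Technology map data:

```python
import numpy as np

# Hypothetical radial-distortion coefficients standing in for a real
# lens map (K1, K2 are assumptions, not Cooke data).
K1, K2 = -0.18, 0.03

def distort_norm(xn, yn):
    """Forward radial (Brown-Conrady style) model on normalized coords."""
    r2 = xn * xn + yn * yn
    f = 1.0 + K1 * r2 + K2 * r2 * r2
    return xn * f, yn * f

def undistort_norm(xd, yd, iters=8):
    """Invert the radial model by fixed-point iteration."""
    xn, yn = xd, yd
    for _ in range(iters):
        r2 = xn * xn + yn * yn
        f = 1.0 + K1 * r2 + K2 * r2 * r2
        xn, yn = xd / f, yd / f
    return xn, yn

def remap(img, map_fn):
    """Sample img at the coordinates produced by map_fn (nearest neighbour)."""
    h, w = img.shape[:2]
    cx, cy, s = w / 2.0, h / 2.0, max(w, h) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    xn, yn = (xs - cx) / s, (ys - cy) / s
    sx, sy = map_fn(xn, yn)
    px = np.clip(np.round(sx * s + cx).astype(int), 0, w - 1)
    py = np.clip(np.round(sy * s + cy).astype(int), 0, h - 1)
    return img[py, px]

# Workflow: plate -> undistort -> composite CG -> re-distort.
plate = (np.mgrid[0:120, 0:160][1] / 160.0).astype(np.float32)  # stand-in plate
undistorted = remap(plate, distort_norm)   # inverse map uses the forward model
comp = undistorted.copy()
comp[50:70, 70:90] = 1.0                   # stand-in "CG element"
final = remap(comp, undistort_norm)        # re-distort the composite
```

Because the CG element is painted into the undistorted frame and the whole composite is then re-distorted, the element inherits the same lens characteristic as the live-action plate.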
This result is, in turn, combined with the EZtrack system’s ability to precisely track camera movement and aggregate it with lens and camera data, plus the movement of many other on-set elements such as mechanical cranes – as the moves occur. This allows the VFX and camera teams to accurately follow the position and movement of the camera at all times, while also giving access to all of the Cooke /i Metadata.
A director can sit on-set watching the feed from the live cameras with a real-time version of a basic CG render for the sake of shot design. More importantly, thanks to the collaboration between Cooke and EZtrack, all the data collected can be used by a VFX team near the set to produce an updated render of the required CG in Unreal Engine. While not ready for cinema screens, it is easily good enough to allow the editor to see their vision come to life in Near Real Time.
This not only gives filmmakers confidence in their shots, but also the freedom to re-think a shot if necessary – not in the post-production phase, when re-shoots become extremely expensive, but in the moments after the shot has happened.
NRT in Action
This innovation in the cinematography process debuted on the Italian film Comandante, an ambitious project that not only benefited from this new approach to data capture and management, but also featured a 1:1 replica of a World War II submarine where most of the action takes place.
The project used Cooke’s 1.8x Anamorphic/i Full Frame Plus lenses as well as /i Technology, which enabled Cooke and EZtrack to provide spatially accurate, frame-accurate distortion maps that track focus pulls over time for both spherical and anamorphic lenses.
The film’s story put extreme demands on the production, calling for most of the action to take place on the open sea. Visual effects designer Kevin Tod Haug said, “As we couldn't be in the middle of the ocean, the backgrounds could never be real. The foreground would also have been extremely difficult to manage because of wind and rain and water.
“Instead, we collected relevant set data from many different sources in different ways – sensors on the camera, the lens data, mechanical cranes, infrared sensors, rotary encoders, DMX lighting control and so on. By working with the EZtrack hub, all of these became one set of useful data that could be used in real time. Because the production wanted to work with anamorphic lenses, the immediate choice was Cooke lenses, which fortunately was what the DoP also wanted from the beginning.
Pulling It All Together
“EZtrack’s NRT workflow is able to take advantage of lens technology and data aggregation. Timothee de Goussencourt, EZtrack’s CTO, working as the movie’s tracking supervisor, was key to bringing the Cooke lens data to set and is part of why the process works – it makes visual effects accessible to the filmmakers. Everybody is involved in it as it is created, and knows what is going on because they can see it on set.”
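The idea of folding many on-set sources into one per-frame data set can be sketched as follows. The source names, field names, and timecode keying are illustrative assumptions, not EZtrack's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class FrameRecord:
    """One merged record per frame, keyed by timecode (hypothetical layout)."""
    timecode: str
    data: dict = field(default_factory=dict)

def aggregate(streams):
    """Merge {source_name: {timecode: value}} dicts into per-frame records."""
    records = {}
    for source, samples in streams.items():
        for tc, value in samples.items():
            rec = records.setdefault(tc, FrameRecord(timecode=tc))
            rec.data[source] = value
    return [records[tc] for tc in sorted(records)]

# Illustrative samples from the kinds of sources named above.
streams = {
    "camera_tracking": {"01:00:00:01": (0.0, 1.5, -3.2)},  # position (x, y, z)
    "lens_focus_m":    {"01:00:00:01": 4.25},              # /i focus distance
    "crane_pan_deg":   {"01:00:00:01": 12.0},              # encoder reading
}
frames = aggregate(streams)
```

Keying every sample to a common timecode is what lets heterogeneous streams become "one set of useful data" that a downstream renderer can consume frame by frame.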
Cooke and EZtrack co-developed their own aspects of the workflow, which fed into The Foundry’s NUKE, the compositing software used in the film’s NRT workflow. Cooke worked alongside EZtrack, The Foundry and Kevin Tod Haug. Cinematographer and VFX supervisor David Stump, ASC, took the role of workflow supervisor for this project, helping to realise this creative and technical achievement. Other VFX vendors involved included Italy’s Wow Tapes and Bottleship from Bulgaria, and the producers are Indigo Films and O’Groove Productions.