‘The Irregulars’ Netflix series follows a street gang of teenagers in Victorian London who solve crimes for Doctor Watson and Sherlock Holmes. More than an adventure story or crime drama, the show takes a supernatural turn as the characters pursue ghosts and discover their own mysterious powers. Previsualisation was critical to bringing the story to the screen, particularly when determining how to visualise various supernatural phenomena.
VFX studio NVIZ worked on various stages of production, including previs and VFX, drawing on their expertise with Ncam camera tracking and augmented reality (AR). NVIZ has built a proprietary AR system based on Ncam’s open-source AR Suite plugin for Unreal Engine.
As well as meeting the initial previs requirements, the previs creation process itself played a role in planning the shoot, which took place in the early days of Covid-19. Working under Production VFX Supervisor Richard Briscoe and Associate VFX Supervisor Adam Rowland, the production had just started on the previs when restrictions were imposed and the set was shut down, with no way of knowing when live shooting would resume. Nevertheless, the filmmakers wanted to be prepared, and proceeded to plan the shoot and set build carefully.
Live Remote Sessions
At that point, NVIZ offered the production the option of live remote sessions to create the previs, blocking out the sequences in real time over a series of video calls. Janek Lender, NVIZ’s Head of Visualisation, said, “In the end we were able to create six previs sequences over just five weeks.”
The video sessions involved the Director Joss Agnew, DoP Nick Dance, Richard Briscoe and the VFX Producer, communicating via conferencing software to lay out the sets and start designing previs shots. Later on, the Production Designer, camera crew and grips also contributed. Janek Lender drove the camera through Unreal Engine in Editor mode and broadcast his screen. The filmmakers discussed the animation, brainstormed ideas and made creative decisions shot by shot.
NVIZ were able to get this new pipeline up and running very quickly. “All of our previs pipeline was moving into the real-time world anyway,” Janek said. “We knew we could offer this kind of service but this was the first time that it was the only way to create the previs, which added an edge to the task.” The scheduling was carefully worked out. Animators working remotely would send their CG animations to Janek, who would ingest them into the live scenes, build shots and sequences from them, and then use those scenes in the live sessions.
Tearing Space and Time
A key previs sequence for NVIZ focussed on an effect called the Rip that occurs in episode 8 when the sinister Linen Man opens a tear in space and time. The Rip serves as a gateway between the natural and supernatural worlds. Through it, Sherlock Holmes looks for the mother of his children, who is trapped in another dimension. Large and pulsating, it opens inside a dark cave and illuminates the environment with an electric blue light. As the scene progresses and the Rip grows, the cave is bathed further in blue light that changes over time.
It was important to previs this sequence carefully in order to fully understand and plan the lighting on set and the effects that were based on it. On set, NVIZ’s Virtual Camera Operator, Eduardo Schmidek, and his team used Ncam camera tracking combined with NVIZ’s in-house AR system. Because Ncam tracked exactly where the camera was located in both the real and CG environments, they could overlay the CG look-development asset for the Rip, with its animation from previs, directly on top of footage captured by the cameras on set.
“As soon as we add the AR Suite and NVIZ AR tool plugins to the previs scene created in Unreal, we connect the camera data to our system and do an initial line-up with the shooting set,” Eduardo said. “The placement of the previs element is interactive and very easy to adjust. This allows us to react to creative decisions, both in terms of the alignment and of other adjustments to CG elements, enhancing the composition on a per-shot basis. We also send a live compositing feed back to the stage, which the DP and camera team can then use to visualise their shots.”
Because they could see a real-time preview of how the Rip would affect lighting, characters and environments as it moved, Joss and Nick made creative decisions on elements such as framing or camera rigs during principal photography instead of relying on imagination alone.
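The principle behind this kind of overlay — using the tracked camera pose to pin a CG element to a fixed spot in the real set — can be sketched with a minimal pinhole projection. This is purely illustrative (the function, numbers and yaw-only rotation are our own simplification, not Ncam’s or NVIZ’s actual code):

```python
import math

def project_point(world_pt, cam_pos, cam_yaw_deg, focal_px, cx, cy):
    """Project a 3D world point into 2D pixel coordinates for a tracked
    camera (pinhole model; yaw-only rotation for brevity)."""
    # Transform the point into camera space: translate by the camera
    # position, then rotate by the inverse of the camera's yaw.
    dx = world_pt[0] - cam_pos[0]
    dy = world_pt[1] - cam_pos[1]
    dz = world_pt[2] - cam_pos[2]
    yaw = math.radians(-cam_yaw_deg)
    x_cam = dx * math.cos(yaw) - dz * math.sin(yaw)
    z_cam = dx * math.sin(yaw) + dz * math.cos(yaw)
    # Perspective divide: as the camera moves, the pixel position shifts,
    # but the CG element stays locked to its world position.
    u = cx + focal_px * x_cam / z_cam
    v = cy + focal_px * dy / z_cam
    return u, v

# A hypothetical "Rip" anchored 5 m in front of the set origin,
# viewed from two camera positions on a 1920x1080 frame.
rip = (0.0, 1.5, 5.0)
print(project_point(rip, (0.0, 1.5, 0.0), 0.0, 1000.0, 960.0, 540.0))  # → (960.0, 540.0)
print(project_point(rip, (1.0, 1.5, 0.0), 0.0, 1000.0, 960.0, 540.0))  # → (760.0, 540.0)
```

With per-frame camera poses streaming in from the tracker, re-running this projection every frame is what keeps the overlaid element glued to the set in the live feed.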
Hugh Macdonald, Chief Technology Innovation Officer at NVIZ, described their proprietary AR tool that runs inside Unreal Engine. “It was built to allow as much flexibility as possible, giving full control over assets in the scene as well as live grading and keying flexibility,” he said. “It uses Ncam’s AR Suite Lite plugin for camera tracking and video I/O, and then follows its own render pipeline using the Unreal Engine frameworks.”
These include Live Link, which streams the live camera tracking data into Unreal; Sequencer, which records and replays the camera tracking; and Timecode Synchronizer, which aligns all the data sources. Building their own tools gives NVIZ the flexibility to extend Ncam’s functionality and to make sure that what they offer is closely aligned with their clients’ needs.
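The job a timecode synchroniser does — pairing each video frame with the tracking sample that belongs to it, even when the two sources arrive at slightly different offsets — can be illustrated with a toy example on static lists (Unreal’s actual Timecode Synchronizer operates on live media sources; everything below is an assumption-free simplification of the concept only):

```python
def align_by_timecode(tracking_samples, video_frames):
    """Pair each video frame with the tracking sample whose timecode
    (here, a frame count) is nearest to the frame's own timecode."""
    aligned = []
    for frame_tc, frame in video_frames:
        # Nearest-neighbour match on timecode distance.
        best = min(tracking_samples, key=lambda s: abs(s[0] - frame_tc))
        aligned.append((frame, best[1]))
    return aligned

# Tracking samples arrive at a slight offset from the video frames.
tracking = [(100.2, "pose_a"), (101.1, "pose_b"), (102.0, "pose_c")]
frames = [(100, "frame_1"), (101, "frame_2"), (102, "frame_3")]
print(align_by_timecode(tracking, frames))
# → [('frame_1', 'pose_a'), ('frame_2', 'pose_b'), ('frame_3', 'pose_c')]
```

Without this alignment step, camera tracking that lags the video by even a frame makes overlaid CG appear to slide against the live image.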
Integrated Teamwork
Eduardo also believes virtual production is ideal for integrating and combining elements from different departments, such as the art department, production design, the DoP and camera crew and the VFX team. “It gives people on set a good sense of how the final image will look,” he said. “For ‘The Irregulars’, it meant we had a very good representation of the final look for the Rip, even if it would still go through further refinements in VFX before the final shots were done.
“That integration also meant the practical light from set matched the CG Rip very closely and, once we were on set, we could do a fine-tuning pass of the colour correction with the VFX Supervisor to make sure the Rip’s look was matching his expectations.
“Another technical bonus that we get from using Unreal Engine, especially with this type of effect, is that even when we masked the CG Rip behind the scanned elements of the set, we would still get a glow from the CG over the live images, which helps to blend them. This also meant that we would still see real shadows from the practical lights under the haze coming from the CG element.
“Because of the way we are able to integrate the work done with Ncam with our previs pipeline, we arrived on set with the same light and lookdev that had been iterated through the previs stage, as the starting point for our Ncam schedule. It also involved the lighting department, using what they planned for their lights on set. Once we arrived on set, we were able to refine any CG lights to match what was effectively executed by the DoP.”
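The mask-versus-glow behaviour Eduardo describes comes down to treating the CG element’s core and its light spill differently per pixel: the core is depth-occluded by the scanned set, while the glow is added on top of the live image either way. A toy greyscale version of that composite (our own sketch, not NVIZ’s actual render pipeline) might look like this:

```python
def composite_pixel(live, cg_core, cg_glow, cg_depth, set_depth):
    """Composite one greyscale pixel (0.0-1.0 range).

    The CG core only replaces the live pixel where nothing in the
    scanned set geometry sits in front of it, but the glow is additive
    everywhere, so the spill still lands on the real set.
    """
    occluded = set_depth < cg_depth      # set geometry is nearer the camera
    base = live if occluded else cg_core # hard depth mask for the core
    return min(1.0, base + cg_glow)      # glow is added either way

# CG visible: the core replaces the live pixel, plus glow.
print(composite_pixel(live=0.3, cg_core=0.6, cg_glow=0.2, cg_depth=5.0, set_depth=8.0))
# CG hidden behind the set: the live pixel survives, but still picks up glow.
print(composite_pixel(live=0.3, cg_core=0.6, cg_glow=0.2, cg_depth=5.0, set_depth=3.0))
```

The second case is the one that sells the integration: an occluded element that still casts its glow onto the plate reads as a real light source in the room rather than a sticker over the footage.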
The production team were pleased with how well it worked. Even in the constrained schedule leading up to a return to live shooting, they made time to run the sessions most days and go through the shots. Because the director and the DoP could drive the cameras in the real-time engine fairly easily themselves, the sessions weren’t used only for effects shots but also for drama, working as a tool for shot-by-shot development.
The director was able to go into action each time with a clear plan of what he wanted to capture, making the previs the blueprint for the shoot. It was also useful logistically: to keep time on set to a minimum, to calculate the minimum number of crew members or actors needed for each sequence, and to work out where they would all be on set. nviz-studio.com