NVIZ created previs for several sequences for the recent film version of ‘Matilda the Musical’. The protagonist Matilda is under pressure from all sides – from her dysfunctional family at home and an abusive headmistress at school – compelling her to invent elaborate stories with fantastic consequences that somehow blend into her real life.
The main story that Matilda pursues as the movie gets underway is a tragic romance concerning an escapologist and an acrobat who marry and together perform a daring circus act involving high-wires and cages, high above a pool of sharks. Their story culminates in a dramatic sequence that NVIZ were asked to previs.
For this performance, as if the pool of sharks were not enough, the Acrobat also wears sticks of TNT clipped into her hair for added excitement. However, at the critical moment she loses her footing, and while the fuse rapidly burns away, the Escapologist, locked in chains inside a cage, must free himself in time to save her as she begins to lose her hold.
Emotion and Suspense
The sequence aimed for extreme emotion and suspense, calling for an ambitious production. An important part of the previs process was to test various assumptions, before shooting, to determine whether their plans were not only possible, but also feasible within the limits of a film set.
Previs was supervised by NVIZ’s Head of Visualisation Janek Lender, who came on board in 2020 to work with Director Matthew Warchus, Cinematographer Tat Radcliffe and VFX Supervisor Simon Stanley-Clamp to help work out this complex scene before the production went to set. The filmmakers used NVIZ’s proprietary virtual camera system called ARENA to set up all the camera moves needed to effectively capture an acrobatic performance and to plot out the action. The sequence went through several iterations as ideas were presented, problems were resolved and lights, action and camera moves were explored and determined.
NVIZ developed ARENA’s software in-house and use it to operate a virtual camera inside a scene in Unreal Engine, allowing them to capture and explore various shots. It involves two operators. One works on the computer running the Unreal scene, using it to tilt the camera lens up or down, pull focus and alter the depth of field; this operator also starts, stops and controls the speed of the animation within the scene. The second operator works with an iPad, using it as a viewfinder to frame the desired shots, ready for capture, just as you would with a camera on a physical set.
The iPad has a VPN connection to the Unreal computer, which keeps all work private and encrypted. “It also means we can connect the iPad (or iPads) remotely – the camera in the UE Scene can be controlled by the iPads across the internet, while at the same time being on a group call to discuss the shots,” Janek said. Once a series of shots has been recorded, the director can make selects and NVIZ will cut these together to previsualise the sequence.
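As a rough illustration of that split in responsibilities, the camera state could be modelled as two control channels writing into one shared state, with a recorded ‘take’ snapshotting both. This is a minimal sketch under stated assumptions; the class and parameter names are hypothetical and are not ARENA’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualCamera:
    # Lens and playback state, driven by the engine operator
    tilt_deg: float = 0.0
    focus_m: float = 5.0
    f_stop: float = 2.8
    playback_speed: float = 1.0
    # Framing state, driven by the iPad viewfinder operator
    pan_deg: float = 0.0
    recorded_takes: list = field(default_factory=list)

    # --- engine-operator channel: lens and animation playback ---
    def set_lens(self, tilt_deg=None, focus_m=None, f_stop=None):
        if tilt_deg is not None:
            self.tilt_deg = tilt_deg
        if focus_m is not None:
            self.focus_m = focus_m
        if f_stop is not None:
            self.f_stop = f_stop

    def set_playback(self, speed):
        self.playback_speed = speed

    # --- viewfinder channel: framing and capture ---
    def frame(self, pan_deg):
        self.pan_deg = pan_deg

    def record(self):
        # A take snapshots the combined state set by both operators
        self.recorded_takes.append(
            (self.pan_deg, self.tilt_deg, self.focus_m, self.f_stop)
        )
```

The point of the split is that neither operator blocks the other: lens and playback changes land in the same state the viewfinder is framing against, so a recorded take always reflects both channels at the moment of capture.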
Perhaps the greatest value of ARENA is that it results from all of NVIZ’s experience in visualising camera work, leading to this practical approach to concepting shots and exploring how to film a story. It is unobtrusive, quick to deploy and needs very little equipment and only two people on set. They have also found that its simplicity makes it more robust, and have used it successfully while working with productions in extreme locations.
Supporting the Director
As well as giving filmmakers control of the camera in the virtual space, rapid turnover of iterations is a further advantage of ARENA. Matthew and Tat Radcliffe would frame up shots themselves, or describe what they wanted to see to Janek, who would then capture the shots for them. Matthew was able to make technical changes on the spot, such as camera angles, but he was also able to adjust the flow of the narrative to create maximum impact. In other words, it was supporting his job as director.
“We were pleased that Matthew, our director, was fully invested in ARENA,” said Janek. “It made the whole process run very smoothly. The director and DoP took to the ARENA sessions, both live and virtual, really well. As a result, some of the sequence was restructured to help amplify the sense of double and even triple jeopardy, thanks to the insight gained during the sessions.”
The collaborative work between the Director, DoP, VFX Supervisor and the Previs and Real-Time Supervisors allowed multiple complex issues to be resolved at decisive moments, while creating the elaborate cinematography required to do the story justice. For example, a critical challenge for the sequence, ahead of anything else, was conceptualising how to shoot a scene that was supposed to be happening 50ft in the air and also included complex, dynamic acrobatic feats.
To begin previs creation, the acrobatic performances were roto-animated from video of choreographed trapeze artists and placed into a real-time engine. This step was crucial to giving the Director, DoP and VFX Supervisor a sense of the spaces they had to work with, and the geography of what would be seen – that is, the pool as viewed from a height, the sharks moving and snapping and so on.
Using ARENA’s proprietary virtual camera, they established the angles that worked best, the scale of the environment and how the lighting should be set up. Once cut together, the shots they created became the working edit for the sequence in terms of action, edit, look and lighting. As the record of the DoP and Director’s vision, it subsequently served as the blueprint for the final cut and was distributed throughout the departments.
Within the virtual environment, the key members of the production were able to take their camera ‘down there’ in the space with the crowd and then ‘up there’ with the Acrobat, using the virtual camera to shoot from all around the environment, choosing the best sides to shoot from at particular moments, all the while looking up and down through the camera.
The Director wanted wider lenses to build up the jeopardy of the Acrobat’s dilemma and to reinforce the feeling that the Escapologist would NOT be able to catch her. This helped to drive the narrative and was important both to establish the connection when the characters see each other and to convey the sense of peril in the distance between them.
The virtual camera process was integral to establishing the right lenses to exaggerate that distance. Experimenting with different lenses, different looks, irrespective of the animation, was helpful for layout purposes and ensuring the pool was the right scale and maintained an element of threat. Ultimately, some of the shots did faithfully reproduce what was created in the virtual camera.
Despite the numerous terrifying elements in the sequence (sharks, TNT and high wires) the most challenging aspect of the work for the filmmakers was completely unexpected – previs was in full flow just as the pandemic struck in 2020 and the team had to quickly move to remote operation (virtual) for ARENA sessions and feedback and approvals. Later they were able to switch between live and virtual depending on the COVID status at the time.
“This was the first time that ARENA was used in a remote capacity, and it was incredibly successful,” said Janek, “We were able to keep the show running when so many were shut down during those early COVID days. Each session took a couple of hours at a time, going through 60 to 70 shots per session from which selects were taken, a process that traditionally could take up to a week each time.
“During COVID, everyone other than the ARENA operator on the Unreal computer worked remotely. That operator worked in the office behind all of our security. Then we would all jump onto a video conference call, including the operator, who would share their screen displaying the Unreal scene. ARENA on the iPad would then be deployed – everyone would see the same images from the virtual shot camera and discuss lenses and angles, allowing us to create shots together.”
The ARENA iPad VPN could actually be set up on several iPads at the same time, so that if the DoP or Director wanted to take control of the camera, one iPad VPN would be disconnected and the other turned on with a couple of clicks. The camera in the Unreal scene would then be driven by the new user on the new connection, even when users and devices were in completely different places or even different time zones.
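The handoff behaviour described above can be sketched as a small hub that tracks connected devices and grants camera control to exactly one of them at a time. This is an illustrative model of the workflow only, with hypothetical names; it is not NVIZ’s implementation.

```python
class CameraControlHub:
    """Grants control of the virtual camera to one connected device at a
    time. A sketch of the described handoff workflow, not ARENA itself."""

    def __init__(self):
        self.connected = set()
        self.active = None

    def connect(self, device):
        self.connected.add(device)

    def disconnect(self, device):
        self.connected.discard(device)
        if self.active == device:
            self.active = None

    def take_control(self, device):
        # Switching controllers: the new device must already be connected;
        # the previous controller simply stops driving the camera.
        if device not in self.connected:
            raise ValueError("device must connect before taking control")
        self.active = device

    def move_camera(self, device, pan_delta_deg):
        # Only the active controller's input reaches the Unreal scene
        if device != self.active:
            raise PermissionError("only the active controller drives the camera")
        return f"camera panned {pan_delta_deg} degrees by {device}"
```

Because the hub is indifferent to where a connection originates, the same logic covers a DoP in another time zone taking over from an operator in the office: disconnect one device, connect the other, and the camera follows the new controller.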
As well as the previs created for the thrilling Acrobat’s story, NVIZ also designed postvis for various scenes, including a menacing sequence in which the Escapologist appears to come to life as a Chain Man. NVIZ’s Head of Postvis Richard Clarke said, “The production wanted to work through the staging, posing and key performance moments of the creature throughout the sequence.
“Speed was of the essence to complete the work in time to inform the edit. Because the Chain Man does not have a solid form, alongside the animation work we looked at how CG lighting could be used to help the audience read the performance, which was critical in such a fast-moving scene.”
nviz-studio.com
Words: Adriene Hurst