DNEG were honoured and excited to push their skill and creativity to the limit for ‘The Matrix: Resurrections’, creating more than 700 shots for the production. The original three films have been the inspiration for so many VFX projects conceived of and made in the nearly two decades that followed their release. They have also inspired countless artists and filmmakers, and finally DNEG’s artists had their own chance to bring that story to life for audiences in 2021.
When DNEG VFX Supervisor Huw J Evans started working on the project in January 2020, the production was already underway at the San Francisco location. Later he joined the production on set in Berlin to supervise the capture of plates for the exo-Morpheus effects. Shortly after that early 2020 start came the hiatus due to COVID, but production picked up again in August that year and continued through the New Year, finally delivering in September 2021.
DNEG’s main team was located in London, led by Huw, supported by teams in Vancouver and Mumbai led by VFX Supervisor Aharon Bourland. All teams maintained a constant dialogue with the production’s overall VFX supervisor, Dan Glass. Huw also led the VFX photography at times, working directly with the director Lana Wachowski on set.
Morpheus in Particles
The exo-Morpheus effects focussed on visualising the character Morpheus’ ability to use a machine to manifest himself. Lana wanted to maintain a free-flowing look for the character but, as a particle effect, it would have to be built in a way that allowed continuous control and revision to make sure it was both readable on screen and capable of achieving her vision.
Huw said, “We worked on various alternative concepts for the look, and handled readability by making the parts of his body that were moving at any given moment – for example, his face while talking – more solid. The actor performed with the cast on set during the shoot, wearing a head-mounted camera to capture his facial performance and a suit that allowed the artists to track his motion visually to guide the animation. A large array of witness cameras was also placed around the set to record his moves from multiple angles.”
Armed with all of this information back in the studio, they built a full digital double of his body. They used Ziva Dynamics for the musculature because, as well as animating the character to match his performance, they needed to highlight the anatomical simulation and deformation of the muscles. From there, FX supervisors Mike Nixon, Tamar Chatterjee, Tom Bolt and their team could recreate him as an accurate, detailed particle simulation in Houdini.
The particles themselves were metallic in nature, resembling steel ball bearings. They achieved detail in the simulation by making it only partly procedural, allowing enough art direction to follow the director’s ideas for the look of the motion. The complex lighting on set during the shoot, involving several different point lights, added to the challenge of compositing the thousands of round, metallic particles into the plates. Instead of trying to create reflections precisely referencing each light, they compromised by keeping the overall ‘flavour’ of the lighting, as Huw described it, in the CG.
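As an illustration of that motion-based readability idea, the short Python sketch below weights each guide-mesh point towards a more ‘solid’ look according to how fast it is moving between frames. It is only a minimal sketch using numpy, with assumed names and thresholds, not DNEG’s actual Houdini particle setup.

import numpy as np

def solidity_weights(prev_positions, curr_positions, dt, max_speed=2.0):
    # Per-point speed of the guide mesh between two frames.
    velocities = (curr_positions - prev_positions) / dt
    speeds = np.linalg.norm(velocities, axis=1)
    # Fast-moving regions (e.g. the face while talking) approach 1.0 (solid);
    # still regions fall towards 0.0 and can break up into loose particles.
    return np.clip(speeds / max_speed, 0.0, 1.0)

# Toy example: four guide points over one frame at 24 fps.
prev = np.zeros((4, 3))
curr = prev + np.array([[0.05, 0.0, 0.0],
                        [0.0, 0.0, 0.0],
                        [0.01, 0.02, 0.0],
                        [0.0, 0.0, 0.1]])
print(solidity_weights(prev, curr, dt=1.0 / 24.0))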
Unreal Dojo
For a beautiful, entirely digital sequence that takes place in a dojo on an island in the centre of a serene lake, DNEG used an approach they had never attempted before in a feature film. Because it was meant to portray a game sequence within the story, it gave them an ideal opportunity to try using Unreal Engine for cinema. It involved about 100 shots. By using Unreal, one person could complete the rendering and lighting work relatively quickly, saving the team considerable time and manpower. Digital FX Supervisor Robin Beard oversaw the sequence, while CG Supervisor Roel Couke focussed on Unreal Engine and communication with the Epic technical team.
“Work on the sequence was shared with the Unreal Engine team who created the concepts, putting forward the semicircular bridge design. The environments and cameras were created in Maya, working at 4K resolution, and then exported to Unreal. Once in the engine, the output was not quite 4K, but later in compositing it could be finessed to give it a higher-resolution appearance,” said Huw.
“At the time, Unreal Engine v4.25 was still in use. Throughout production, their team took DNEG’s requests and helped us achieve what we needed, in turn using what was accomplished to enhance their software updates. Examples are OCIO colour support, and the ability to split the images into layers in order to work on them in Unreal.”
The dojo in the centre and some of the surrounding water were built as traditional CG elements so that the building could be exploded, fitting in with the story events, all of which was composited into the shots later. “Destroying the dojo was a fantastic moment for us to work on,” Huw said. “Simulating the large wooden structure splintering and splitting apart, and then splashing down into the water, along with a powerful shockwave causing more destruction, was a complex task.
“We have some great destruction toolsets and setups that we can leverage, so we decided that this work would be best handled in Houdini, then passed to Clarisse for lighting. This would then be composited with the rest of the background generated in Unreal to get the final image. Tree interaction from the shockwave proved an additional challenge, however, as we needed to export the Unreal trees into Houdini to allow us to rip leaves and branches off while swaying the trees and kicking up additional dust as this shockwave blasted through.”
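As a rough sketch of the kind of starting point such destruction work builds on – and nothing like the full production network DNEG describes – the lines below wire a basic Voronoi fracture chain in Houdini through the hou Python API, with assumed node and file names.

import hou

# Geometry container for the destruction setup.
container = hou.node("/obj").createNode("geo", "dojo_destruction")

# Bring in exported geometry (for example, trees exported from Unreal).
src = container.createNode("file", "import_geo")
src.parm("file").set("$HIP/geo/dojo_export.bgeo.sc")  # hypothetical path

# Scatter fracture seed points over the mesh.
seeds = container.createNode("scatter", "fracture_seeds")
seeds.setInput(0, src)
seeds.parm("npts").set(500)

# Cut the mesh into rigid pieces around those points.
fracture = container.createNode("voronoifracture", "fracture")
fracture.setInput(0, src)
fracture.setInput(1, seeds)

container.layoutChildren()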
60 Years of Detail
The massive environments DNEG built for the movie are best characterised by their extraordinary level of detail. They contain huge numbers of elements, dense textures, animation and atmospherics extended over vast distances. The Foetus Fields, familiar to followers of The Matrix films, and the new megacity IO are similar in this respect, although DNEG handled the challenge in different ways for each of them. As in the dojo sequence, USD was useful in several of these scenes, in this case to cache out the geometry detail – storing it and retrieving it only as needed so that it would fit into RAM.
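As a minimal sketch of that caching approach – with hypothetical prim paths and asset file names, not DNEG’s actual pipeline – the USD payload mechanism lets heavy geometry sit on disk until a section of the environment is explicitly loaded.

from pxr import Usd, UsdGeom

# Author a lightweight shell prim whose heavy geometry lives behind a payload.
stage = Usd.Stage.CreateNew("environment.usda")
shell = UsdGeom.Xform.Define(stage, "/Env/FoetusFields").GetPrim()
shell.GetPayloads().AddPayload("foetus_fields_heavy.usd")  # hypothetical asset
stage.GetRootLayer().Save()

# Open the stage without loading any payloads, then pull in only what a shot needs.
working = Usd.Stage.Open("environment.usda", Usd.Stage.LoadNone)
working.Load("/Env/FoetusFields")    # geometry for this section comes into RAM
working.Unload("/Env/FoetusFields")  # and is released again when finished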
Work on the Foetus Fields began as concept art, and needed to show 60 years of scientific development and story progression. Initially, DNEG attempted to collect the original assets of the Harvesters, Sentinels and foetus eggs, despite the fact that they would be 20 years old. Some were actually recovered, but needed considerable retexturing and retopologising. Overseeing the sequences was Digital FX Supervisor Steve Newbold, working with Environment Supervisors Ben Cowell-Thomas and Nigel Wagner.
Huw described the value of going through that process. “While we managed to restore the pod stalk asset from the original movie, we had to completely rebuild it as we needed more detailed geometry and higher fidelity textures and lookdev so that it would hold up to today's standards – and also at 4K resolution. It was a fantastic reference for us, though, and made a great base to make sure we were as close to the original design as possible, before we adapted it with the additional cables and details.”
The environments comprised huge towers holding some 18,000 pods each. Houdini was used to build the towers, which allowed them to use USD to accomplish the endless scattering of the foetus pod elements across the landscape. USD’s PointInstancer schema provides scalable, vectorised encoding of instancing, scattering instances of multiple prototype objects, and both the instances and the prototypes can be animated.
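A minimal sketch of that kind of PointInstancer setup, using the pxr Python API, follows; the prim paths, stand-in sphere prototypes and random scatter are assumptions for illustration rather than DNEG’s production assets.

import random
from pxr import Usd, UsdGeom, Gf, Vt

stage = Usd.Stage.CreateNew("pod_tower.usda")

# A few prototype prims; in production these would be the detailed pod assets.
proto_paths = []
for i in range(3):
    proto = UsdGeom.Sphere.Define(stage, f"/Prototypes/Pod_{i}")
    proto_paths.append(proto.GetPath())

# One PointInstancer encodes every instance as a prototype index plus a
# position, which stays lightweight even at ~18,000 pods per tower.
instancer = UsdGeom.PointInstancer.Define(stage, "/Tower/PodInstancer")
instancer.CreatePrototypesRel().SetTargets(proto_paths)

num_pods = 18000
indices = Vt.IntArray([random.randrange(len(proto_paths)) for _ in range(num_pods)])
positions = Vt.Vec3fArray([
    Gf.Vec3f(random.uniform(-50.0, 50.0), i * 0.1, random.uniform(-50.0, 50.0))
    for i in range(num_pods)
])
instancer.CreateProtoIndicesAttr().Set(indices)
instancer.CreatePositionsAttr().Set(positions)

stage.GetRootLayer().Save()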
As they needed to maintain a 4K working resolution, the environment was handled in sections divided by level of detail. That is, close-up shots were handled as one section, fly-throughs comprised another, and an area far below most of the activity, where the roots of the structures were located, was another. The detail for each of these was built accordingly. Rendering so much detail was a particular challenge, handled by Lighting Supervisor Simone Vassallo.
Living City
The city environment of IO was completely new. The city in the previous films, Zion, had been underground and relatively small. In contrast, humans and machines had built IO together as a productive, sustainable environment. Instead of a cave it had a false sky – a ‘bio sky’ – supplying oxygen and supporting agriculture. Therefore, DNEG needed to depict areas of different types of activity, like farming, residential life and factories, within the one city. Ben Cowell-Thomas on environments and Simone Vassallo on lighting supervised these sequences as well.
The walls were of rock, carved in ZBrush. Large areas of IO could be covered with digital matte paintings, to which they added movement with small crowds, placed specific lights, and created fog and other atmospherics. This environment required an extreme amount of layering, which they worked on throughout production. Instead of approaching it as a procedural build, which would have been very complex and harder to control, they built blocks of action linked together visually and logically. This way the environment was ready for the camera to fly through, to show to the director and decide on camera moves.
Flashback to Destruction
DNEG was also responsible for an interesting sequence of environmental work that was handled quite differently from IO – a flashback moment from the dark past where the audience sees the Machines at war with each other. Huw said, “For IO, we knew we had a whole sequence of shots set in that space with a fly-through, and so designed the environment for those shots accordingly and in much greater detail to allow for this journey. But for the flashback, we knew that although it would be a very brief moment in the movie, it would still require an incredible amount of work and detail in order to sell the story point.
“The brief was to create a post-apocalyptic environment showing recognisable signs of human life, with details like a sunken Statue of Liberty and a city edge resembling Battery Park-type remnants. It was initially to be shot from a single angle, in which case we could have relied on matte painted elements, but as the complexity grew and new angles were needed, we pushed further into a geometry approach. We still kept it as simple as possible, using as many kitbash pieces as we could and relying on projected textures to help fill out the scene as well as detailed matte painting work to tie it all together.
“We were even tasked with creating some new creatures – the Harvesters didn't feel much like fighters to Lana, so instead the 'Squid Tank' and 'Mini Squids' were born. Reference was taken from sketches by Geof Darrow, concept designer and storyboard artist for all of The Matrix films, fusing insect-like design with machines, and we also borrowed from the Harvesters and Sentinels so they felt of the same world. The Armada ships were another asset that was restored from the original movies, again requiring a complete rebuild. To add to the chaos of the scene, attacking Sentinel swarms were simulated in Houdini along with laser fire and multiple explosions, just to help amp up the spectacle and destruction of it all.”
Out of the Goo
The Anomaleum – the chamber where Neo’s and Trinity’s foetus pods were kept – was a special environment DNEG created. Each pod was attached to a turbine, a huge, highly complex piece of machinery, and Neo and Trinity were also connected to the pods. Portions of this chamber were physically built on set, and DNEG then built it up digitally to about 50ft tall. Small microbots scuttle around it – further new creature-characters that were also produced from concepts.
“A multitude of cables connected Neo's body to the pod, which we had to create digitally for safety, comfort and ease of shooting. This was no mean feat, however, as we had to individually track each port on his body as they each moved differently according to how his muscles moved or flexed and how his skin slid around,” Huw said. “We had to run a pass of muscle and skin simulation to ensure the ports moved correctly across the skin as he moved and so the tracks would be tight enough that they would look convincing in a closeup at 4K resolution.”
But the foetus pod experience was less than elegant. “Once we had our ports tracked, the cables themselves had to be simulated, and then needed a layer of dripping goo of the right consistency clinging to the cables and dripping down off Neo's body to match the practical footage. Then, of course, where these CG cables enter the practical pod goo that Neo sits in, we needed to do partial pod goo replacements so we could get the surface interaction as he moves around. It all added up to a large amount of detailed work, but hopefully it’s one of those moments that nobody notices and just assumes is practical, meaning we've done our job correctly.”
Friendly Robots
DNEG had the opportunity to create three new robot characters for this film – this time, quite benevolent robots called Synthients that live peacefully with people and helped build IO into a much more livable environment than the previous city, Zion. They also become allies of Neo and Trinity. The Synthients are without facial features, but each one has distinctive physical characteristics. DNEG built and animated them digitally – led by Build Supervisor James Guy and Anim Director Keith Roberts – but when the robots needed to interact with the set and cast, the production used practical stand-ins to help the artists understand how to build and then composite them effectively into the shots.
For instance, Cybebe was a round, squashy character represented with a large yoga ball covered in green screen material that Huw manipulated on set. Because Octaclese was illuminated, the stand-in actor wore a head-torch, and Lumin8 was represented by a maquette.
Morpheus was not the only character that needed a digidouble. Hero doubles were needed for the lead characters Neo and Trinity, among others. The actors’ faces were mainly preserved, although their FACS shapes were recorded, and their bodies were captured with massive camera arrays inside a photo booth, allowing them to be fully reconstructed in any type of shot.
www.dneg.com