Senior stereo supervisor Richard Baker talks about the challenges the Prime Focus World team encountered as sole stereo conversion vendor on ‘Edge of Tomorrow’.


Prime Focus Handles Stereo Conversion for Edge of Tomorrow

Prime Focus World was the sole stereo conversion vendor on ‘Edge of Tomorrow’. Senior stereo supervisor Richard Baker and senior vice president of production Matthew Bristowe worked with the production’s stereo supervisor Chris Parks. Richard led Prime Focus World’s stereo teams around the world to deliver all 2,095 shots for the film. Prime Focus Vancouver processed the VFX elements delivered by MPC in Vancouver, the London facility concentrated on the 910 shots involving head geometry, and the team in Mumbai completed the conversion of the remaining 1,185 shots.

Prime Focus uses their own patented tools for stereo conversion, collectively known as View-D, which work within eyeon Fusion and Nuke. They also use tracking software and Maya. Their pipeline teams are continually developing proprietary tools to deal with new challenges they encounter on each project. For this film in particular, the Prime Focus World team employed its geometry mapping techniques for the lead actors, Tom Cruise and Emily Blunt, and for certain key environments, to ensure consistency and accuracy of depth throughout the intense action of the story.

“This was Tom Cruise’s first 3D film, so we needed to show that he was going to look fantastic in stereo,” said Prime Focus World Senior Stereo Supervisor Richard Baker. “This tied in well with our use of head geometry. We use head geo to a level of accuracy that I don’t think any other company can match, and Chris Parks, being a native stereographer, really values that. We were able to show Chris, the director Doug Liman and the studio some very successful early shots of Tom in 3D.”

Wanting to create highly sculpted, accurate faces, they chose to use facial cyber scans, captured using a 3D laser scanning device and rigged to drive the depth for all of the key actors. Richard said, “We retain a high poly count in our models because I feel that a high level of detail is important. The basic structure is important, of course, but to really achieve accurate stereo the model needs to translate all the subtle curvatures of the face. Using the head geo is especially beneficial with head turns and other movement because all the relationships such as the cheek to nose remain accurate. Details like this make all the difference. As another example, it can be surprising how far back a person’s ears are set, a detail that may only come under scrutiny when we recreate stereo, so knowing the true position helps a lot - we can always adjust if necessary.
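
As a much-simplified illustration of why tracked head geometry keeps those facial relationships consistent, the hypothetical sketch below rotates a few made-up head landmarks (nose tip, cheek, ear) and recomputes their distance from the camera; because the depth comes from a single 3D model, the cheek-to-nose and ear relationships follow the head turn automatically. The landmark positions, camera distance and Python code are illustrative assumptions, not Prime Focus World’s View-D tools.

    import numpy as np

    # Hypothetical head landmarks in head-local space (metres):
    # x = actor's left, y = up, z = forward out of the face. Values are made up.
    landmarks = {
        "nose_tip": np.array([0.00, 0.00, 0.10]),
        "cheek":    np.array([0.05, -0.02, 0.06]),
        "ear":      np.array([0.08, 0.01, -0.03]),  # ears sit further back than expected
    }

    HEAD_DISTANCE = 2.5  # assumed distance from camera to the head centre (metres)

    def camera_depth(point, yaw_degrees):
        """Distance from the camera to a head-local point after the head yaws,
        with the camera looking straight at the head centre."""
        yaw = np.radians(yaw_degrees)
        rotation = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
                             [ 0.0,         1.0, 0.0        ],
                             [-np.sin(yaw), 0.0, np.cos(yaw)]])
        rotated = rotation @ point
        # The face looks toward the camera, so 'forward' reduces the distance.
        return HEAD_DISTANCE - rotated[2]

    for yaw in (0, 30, 60):
        depths = {name: camera_depth(p, yaw) for name, p in landmarks.items()}
        print(f"yaw {yaw:2d} deg: " +
              ", ".join(f"{name} {depth:.3f} m" for name, depth in depths.items()))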

“I also find that actors are simply not used to seeing themselves in stereo, so it can take a little getting used to. ‘Edge of Tomorrow’ was Tom Cruise’s first 3D film, as was ‘World War Z’ for Brad Pitt, and whatever you do to actors with such high profiles is going to come into question. This is why it is so important to make sure it’s correct and looks real. When I say ‘real’ I mean real according to how we see people in real life. Shooting so-called ‘native’ stereo doesn’t mean it’s right, as the depth and lens choices have a strong impact on how the stereo can look. For me personally, I like features and environments to look as I see the world. Human eyesight is the truest 3D there is, so that’s my personal preference.”

The heads were animated using artist-driven tools developed in-house, which allowed each artist to carry out the facial tracking and animation. This meant they didn’t need a separate team of artists for this stage and kept the development of the shot in one set of hands. Richard reviewed all the head animations and gave guidance on the disparity required for each shot. Overall shot disparity, consistency with similar shots, lens choice and distance from camera determine the depth.
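
As a simplified illustration of how lens choice and distance from camera feed into that depth, the sketch below uses the standard converged-rig parallax relation to compute screen disparity for points at different distances; the interaxial, focal length and convergence figures are made-up example values, not numbers from the production.

    def screen_disparity(depth_m, interaxial_m=0.065, focal_px=1800.0, convergence_m=4.0):
        """Screen disparity in pixels for a point at the given depth, using the
        usual converged-rig relation: zero at the convergence distance, negative
        (in front of the screen) for closer points, positive for farther ones."""
        return interaxial_m * focal_px * (1.0 / convergence_m - 1.0 / depth_m)

    # A longer lens (larger focal_px) or a closer subject both increase disparity,
    # which is why lens and camera distance are weighed when setting a shot's depth.
    for depth in (1.5, 4.0, 20.0, 200.0):
        print(f"{depth:6.1f} m -> {screen_disparity(depth):+7.2f} px")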

Owing to the film’s intense action, it was important to ensure a high level of depth and detail in every shot. “In some conversions all the action is placed at the front, and no time is spent sculpting the environment,” Richard said. “It’s something viewers may not immediately notice, but missed details such as this can make the difference between believing a shot or not. It’s similar to colour - you don’t always notice grading changes, but subliminally your brain takes it all in - it’s part of the overall experience of watching a film.

“Among the many factors contributing to the design of a shot, lens choice has an impact, as does the amount of action, so it’s about finding consistency in your stereo design while creating immersive, interesting 3D that complements the integrity of what the director filmed. There are times, with a close-up for example, when the choice is not to position the background too positively because that depth can distract from the character in an emotional moment. One advantage of using our tools is that all the decisions can be made in post as the edit comes together.”

An interesting sequence shows the characters interacting with a hologram, created by visual effects studio Nvizible in London, that gives them a 3D view of a key location. However, semi-transparent holograms can be difficult to convert, making it preferable to obtain a stereo render if possible, which was what they did in this case. It was a good opportunity to use the stereo camera generation system Prime Focus World developed for ‘Gravity’.

The stereo camera generation tool is part of the company’s proprietary Hybrid Stereo Pipeline. A stereo camera pair is generated from hand-sculpted disparity maps to produce a virtual rig that will work in any CG or compositing environment, allowing the VFX vendor to render CG assets with exactly the right amount of depth for a given slice of the scene.
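
A minimal sketch of the general idea, not the actual Hybrid Stereo Pipeline: if the hand-sculpted disparity and a matching depth estimate are known for the same pixels, the interaxial and convergence of a converged virtual rig can be fitted so that the rig’s disparity d(Z) = t * f * (1/Zc - 1/Z) reproduces the sculpt, and the second camera is simply the first one offset by the fitted interaxial. All names and figures below are assumptions for illustration.

    import numpy as np

    def fit_virtual_rig(sculpted_disparity, depth, focal_px):
        """Fit the interaxial (m) and convergence distance (m) of a converged
        virtual camera pair so it reproduces a hand-sculpted disparity map.

        The model d(Z) = t * f * (1/Zc - 1/Z) is linear in the unknowns
        a = t * f / Zc and b = t * f, so it can be solved by least squares."""
        d = np.asarray(sculpted_disparity, dtype=float).ravel()
        inv_z = 1.0 / np.asarray(depth, dtype=float).ravel()
        A = np.column_stack([np.ones_like(inv_z), -inv_z])
        (a, b), *_ = np.linalg.lstsq(A, d, rcond=None)
        return b / focal_px, b / a   # interaxial, convergence distance

    # Toy check: build a 'sculpted' disparity from known rig values and
    # confirm the fit recovers them (illustrative numbers only).
    focal_px = 1800.0
    depth = np.linspace(2.0, 50.0, 500)
    true_interaxial, true_convergence = 0.06, 5.0
    sculpt = true_interaxial * focal_px * (1.0 / true_convergence - 1.0 / depth)

    interaxial, convergence = fit_virtual_rig(sculpt, depth, focal_px)
    print(f"fitted interaxial {interaxial:.3f} m, convergence {convergence:.2f} m")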

“Our Hybrid Stereo Pipeline - of which the stereo camera generation tool is a part - is essential to our process due to the difference between the linear, mathematically correct stereo generated during the visual effects process and the non-linear, creatively driven stereo created in conversion,” Richard said. “It allows us to integrate the two in the same scene, through the generation of a virtual stereo camera pair that the VFX house can use to render the CG assets with exactly the right amount of depth for a particular portion of the scene.

“When the tool was developed during our work on ‘Gravity’, it allowed Framestore to render stereo CG elements - such as the pens, helmets and straps that were floating inside the capsules to sell the effect of weightlessness - with volumetric properties that matched our converted scenes exactly. This was necessary because, due to the immensely complicated nature of the production, we were working with live CG assets rather than finals.”

The disparity maps are tailored to produce the depth that the client is looking for in the scene, and the cameras are 100 per cent accurate to the depth of the stereo image they have sculpted. Before the cameras are handed off, the vendor’s CG objects are rendered through them as a QC step, to check with the stereography department that the objects fit correctly within the converted scene.
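
As a hypothetical illustration of that QC step, the short sketch below compares the disparity at which CG renders land against the sculpted disparity map and reports whether everything sits within an agreed tolerance; the function name, tolerance and data are assumptions, not the studio’s actual tooling.

    import numpy as np

    def qc_disparity(rendered_disparity, sculpted_disparity, tolerance_px=0.5):
        """Compare the disparity of CG objects rendered through the generated rig
        against the hand-sculpted disparity map for the same region."""
        error = np.abs(np.asarray(rendered_disparity, dtype=float) -
                       np.asarray(sculpted_disparity, dtype=float))
        return {
            "mean_error_px": float(error.mean()),
            "max_error_px": float(error.max()),
            "passes": bool((error <= tolerance_px).all()),
        }

    # Example with made-up values: a small, uniform offset that still passes QC.
    sculpt = np.full((4, 4), 12.0)
    render = sculpt + 0.2
    print(qc_disparity(render, sculpt))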

They were able to use the stereo camera generation technique somewhat differently with Nvizible on ‘Edge of Tomorrow’ than they had with Framestore on ‘Gravity’. Richard explained, “On ‘Gravity’ we were working with live CG assets due to the nature of the production and the approvals chain. The stereo camera generation tool allowed us to lock our converted scene and deliver this to Framestore with an exact virtual stereo camera pair, so that they could then finalise, render and comp their stereo CG assets into the converted scenes knowing that their volumetric properties would be exactly correct for each scene.

“On ‘Edge of Tomorrow’, once the decision was made that the holograms would be better CG-rendered rather than converted, we passed Nvizible the virtual stereo rig we had generated for the scene, and they rendered a right eye of the hologram elements. We then took these new renders and composited the stereo VFX into our converted shots here at Prime Focus.

“It made more sense to do it this way than to bring Nvizible into the stereo approval chain because the details of hologram visual effects are challenging to convert due to the transparencies. The full mesh needs to parallax correctly and have clean negative and positive separation through the movement of the object and shot.”

Prime Focus also further developed its use of particle FX for 3D moments, adding CG exploding debris and dust to scenes, as well as using cyber scans of set environments to maintain detailed accuracy across a wide range of shots. The use of particles or added debris is designed along with the depth. Often, certain sequences such as the beach battle in ‘Edge of Tomorrow’ will be conceived early on as scenes that would enhance the immersive quality of a movie by having multiple layers of dust and debris added to them.

“Also, after a VFX team has finalled a shot, during the stereo stage we sometimes decide that adding debris from explosions into negative space will create some nice 3D moments,” Richard said. “Prime Focus World’s artists have visual effects experience too, so modelling, lighting, rendering and compositing extra elements that match the final VFX shot is something we are able to do. In other words, stereo conversion is not just about making a shot 3D - it has evolved from a roto-based process into a full-blown VFX pipeline working with matchmove, modelling, geo, VFX assets, set data, LIDAR and, whenever it’s possible, on-set supervision as well.

“The benefit for the VFX house is that their composite can be 2D, and then we can take the layers and elements they provide for the conversion and add our own particles to enhance the 3D effect as required. Practical particles existing in the plates - from practical explosions or flying debris, for example - become still another part of the conversion process. We have tools to extract them to create depth - sometimes we will clean-plate them if they are distracting to the shot, and replace them.”  www.primefocusworld.com
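
A much-simplified sketch of that extraction idea, under assumptions of our own (a clean plate is available, a simple difference key is enough, and negative disparity is the convention for floating an element in front of the screen): pull a debris matte from the difference against the clean plate, then shift the extracted layer horizontally to give it its own depth in the second eye. None of this represents Prime Focus World’s actual extraction tools.

    import numpy as np

    def extract_particles(plate, clean_plate, threshold=0.08):
        """Very rough particle matte: wherever the plate differs from the clean
        plate by more than the threshold, treat the pixel as debris."""
        diff = np.abs(plate.astype(float) - clean_plate.astype(float))
        return (diff.max(axis=-1) > threshold).astype(float)

    def place_in_negative_space(layer, matte, disparity_px=-2):
        """Build the second eye of the extracted layer by shifting it horizontally;
        a negative disparity pushes the debris out in front of the screen.
        (np.roll wraps at the frame edge; real tools would pad instead.)"""
        return np.roll(layer, disparity_px, axis=1), np.roll(matte, disparity_px, axis=1)

    # Toy 8x8 'plates' with a single bright debris pixel (illustrative only).
    clean = np.zeros((8, 8, 3))
    plate = clean.copy()
    plate[3, 5] = 1.0

    matte = extract_particles(plate, clean)
    _, right_matte = place_in_negative_space(plate * matte[..., None], matte)
    print("debris in left eye at:", np.argwhere(matte > 0))
    print("debris in right eye at:", np.argwhere(right_matte > 0))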

Words: Adriene Hurst
Images: Courtesy of Warner Bros Pictures