Pulling together the many layers of assets, lights, explosions and visual effects that make the fantastic universe of ‘TRON: Legacy’ was no small task. To achieve the director’s vision, Compositing Supervisor Paul Lambert at Digital Domain worked with a team at The Foundry and artists from numerous vendors. One vendor was Prime Focus, who sent the film’s heroes out across the digital grid landscape in the massive Solar Sailer.


The shoot took place from March to August 2009 in Vancouver, where VFX Supervisor Eric Barba from Digital Domain worked on-set with a small integration team. ‘TRON: Legacy’ was Paul’s first stereo project, and one of the first movies made on the Pace rig after ‘Avatar’. The stereo work felt new to everyone at Digital Domain, which had worked on dimensionalised projects before TRON but never on one shot natively in stereo.

Camera Testing
As work got underway, Paul and his colleagues were also aware of the lack of experience in working on stereo projects across the industry. “Even the camera the production used, the Sony F35, was still new when the shoot started and needed extensive testing,” Paul said. “The cameras had to be prepped for stereo work. We made tracking tests, tested the colour between plates and got used to looking at stereo footage.”

This film was significant for Digital Domain because they were working with the production and director from the very outset, carrying out previsualisation and running the Art Department in-house. “Digital Domain has never had that level of access to a production before,” said Paul. “We have always worked as a vendor under an on-set VFX Supervisor hired by the production. But on this project, our VFX supervisor Eric Barba took that role, giving us much more contact, and everyone found working this way very enjoyable.”

The Digital Domain team completed 17 of the film’s sequences including major set pieces for the Disc Game, Light Bike Battle, Light Jet Battle and the portal sequence at the film’s climax. They also set up the looks on all sequences outsourced to other vendors, where the teams would develop them further following the rules and guidelines for the show, which were specific and precise. Consistency was crucial for this film and because about half the film’s shots had to be outsourced, maintaining control was a major task.

Follow Focus
Digital Domain’s compositing pipeline, and that of the other vendors working on TRON, was based on Nuke, which was used to develop several tools that helped the different teams keep the film’s look very consistent. One of these was a follow focus system. All focussing was done in the composite. Because of the dark environments, Cinematographer Claudio Miranda and director Joseph Kosinski had shot the TRON footage wide open at about T1.3, so all footage had a very narrow depth of field that all teams had to match in their CG work.

To use the follow focus system, artists would input their camera data, the CG elements and, for example, animation data indicating where the camera should be. From there, the system could work out from the plate how to focus their work correctly in stereo space, which helped give the CG an integrated, realistic look.
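The optics behind matching that narrow depth of field can be sketched with the thin-lens circle-of-confusion formula. This is a hypothetical simplification, not Digital Domain’s actual Nuke tool: the blur diameter for a point grows with its distance from the focus plane, and a wide T1.3 aperture makes that growth steep.

```python
def blur_diameter_mm(depth_mm, focus_mm, focal_mm=50.0, t_stop=1.3):
    """Thin-lens circle of confusion (mm on the sensor) for a point at
    depth_mm when the lens is focused at focus_mm. A wide aperture
    (low t_stop) gives the very narrow depth of field seen on TRON."""
    aperture = focal_mm / t_stop  # entrance pupil diameter in mm
    return aperture * focal_mm * abs(depth_mm - focus_mm) / (
        depth_mm * (focus_mm - focal_mm))

# A point on the focus plane renders sharp; blur grows away from it,
# and a stopped-down T5.6 lens would blur far less than T1.3.
```

Driving a defocus kernel with a value like this per CG element, using the shot’s camera data, is one plausible way such a follow focus system could keep CG and plate focus consistent.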

They handed out plug-ins to do all flare work as well, because the film features a highly stylised flare with scan lines, controlled by angles of light. Such tools were developed on the actual shots, and by the time they sent them out, they were complete pieces of software, fairly intuitive to use. For example, to add a flare to a shot, the plug-in would use the location of the other lights in the shot to produce the appropriate ‘TRON flare’.

Another tool helped produce the correct glow lines on each character’s suit, reflecting the particular colour palette associated with his identity. These lines often had to be enhanced to look brighter and show the correct colour and flicker frequency. “For some things like pulling the mattes to add the glows, there wasn’t a prescribed procedure,” Paul explained. “Sometimes the glow could be keyed, but in other instances it had to be rotoscoped. The suits sometimes failed and the light would go out, which meant it had to be rotoscoped back in.”

Stereo Workflow
As a compositing supervisor, Paul said that developing a stereo workflow was probably his biggest challenge on TRON. “We worked so that one person would complete the whole shot for both eyes, first finalising and rendering the flat shot for the left eye and then the right, so that we would have the complete stereo version within about a week.

“Throughout, we were working out what we could and couldn’t do in stereo compositing and tricks that we could get away with. We needed to gain a clearer understanding of interocular convergence and were learning how to compare one eye with the other to create a better image. Some people have more natural skill at this, while others have to learn and can get headaches at first. Double checking every shot for consistency is time consuming but critical.”

The action in the TRON world was played out over a clearly defined area. When you looked out over a wide area, for example in the Light Bike sequence, you would see a view in which every asset had to be located on the correct coordinates, so that when you rotated the camera around you would still know where you were in that space and what you would see. Likewise, on the Solar Sailer’s flyover sequence, because the characters were travelling so far, they had to actually pass by places such as the stadium or the arena, all positioned in the correct locations on the grid.
Digital Domain’s layout department worked out this coordinate system. While the Prime Focus team could use real-world units due to the scale of their work, Digital Domain used a scale of 10:1 – working in decimetres instead of metres – because of the precise head integration work they did for the CG characters Clu and Tron (Alan Bradley).

Compositing Challenge
Paul was one of two Compositing Supervisors on the project at Digital Domain, working alongside Sonja Burchard, each with a team of 20 to 25 compositors. From a compositor’s point of view, this film demanded a lot of keying to extract the characters accurately from blue screen and integrate them into this all-glass environment. Paul is very satisfied with the result. The work has earned a VES nomination for Outstanding Compositing in a Feature, and their work on the Disc Game sequence has earned another for Outstanding Created Environment in a Live Action Feature.

“The many optical effects were also a major factor in integrating live action with CG layers. All the optics created for the light bikes – moving into camera, turning, twisting, jumping – were done in the composite, slightly stylised of course but they added tremendously to the real-world feeling we achieved in the shots.

“In the Light Jet sequence, flying in a massive environment with atmosphere everywhere, we were trying to composite everything together, keeping the glows and interactivity in place with bullets flying everywhere as well. We wanted to give a believable look to tracer fire and exhaust systems. All the tiny bits of optics really helped us finesse the CG to look as real as possible.

Z-depth Glows
“To the Light Jets’ trails going through the fog, we would add interactive glows, and we also developed a plug-in to create glows based on z-depth. Rather than putting this through the render, which would have taken days and days, this plug-in would create glows of different sizes based on their z-depth and then drive the interactivity into the atmosphere, explosions and tracer fire passing through smoke. These effects could all be done in compositing.”
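The principle can be sketched as follows. This is a toy stand-in for the plug-in described above, not its actual code: the screen-space glow radius falls off with an emitter’s z-depth, so distant lights bloom less before the glows are added over the plate.

```python
import numpy as np

def add_glow(img, x, y, depth, base_radius=20.0, z_ref=1.0, intensity=1.0):
    """Additively splat a Gaussian glow whose screen-space radius
    falls off with z-depth (farther emitters glow smaller)."""
    radius = base_radius * z_ref / max(depth, z_ref)  # perspective falloff
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    d2 = (xx - x) ** 2 + (yy - y) ** 2
    img += intensity * np.exp(-d2 / (2.0 * radius ** 2))
    return img

canvas = np.zeros((64, 64))
add_glow(canvas, 16, 32, depth=1.0)  # near light: broad glow
add_glow(canvas, 48, 32, depth=8.0)  # far light: tight glow
```

Because it is all image-space arithmetic on rendered depth, an approach like this runs in seconds in the composite rather than days on the render farm, which matches the motivation Paul describes.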

The climactic Portal sequence involved hundreds of layers and passes. It had the Light Jet environment as the background, plus clouds, the sea and atmospheric passes, then the bright portal itself composed of various elements, and the bridge. Adding the CG Jeff Bridges added another full system of passes – it all added up to around 150 passes altogether.

“Across TRON, the environments were top lit with quite flat lighting, which was intentional of course but did make our head integration work more difficult. We had found on ‘Benjamin Button’, where we had similar work, that if the lighting was fairly moody with identifiable point sources of light, it actually made it easier to get everything working together, while the exterior shots took longer because the light was flatter without a direct source,” Paul said.

“However, all of TRON was made this way – no moon, sun or time of day. This was challenging at the beginning of production when we tried to define the Light Jet world, our first sequence. Having no specific source of light made shaping the planes, textures and surfaces in the environment more difficult.”

Head Integration
Digital Domain completed all of the integration work on the heads of the two CG characters, which meant sharing shots and sequences in many cases with their outside vendors. Their head pipeline was the same as the one they had developed for ‘Benjamin Button’ in 2008. Any shot requiring a head replacement was shot with witness cameras as well as the main camera. Just after the actual take, their integration team would capture the HDR of the environment, giving them the precise lighting of that take.

Back at the studio, before a shot was assigned, they would assemble this HDR and break it down into its specific geometry. The team would also have surveyed the set, taking all measurements, collecting textures and pictures of all the lights to be able to rebuild that scene in Nuke, which they would test for correctness.

Once satisfied, they would publish the information out to the lighters, so they had all the data they needed for the lights cut out from the HDRI and put onto cards with all the associated geometry and could light the head correctly according to the shot for a photoreal result. This pipeline has now been used on several projects at Digital Domain.


TRON Rules
Prime Focus was among the last officially awarded vendors on TRON and started work in February and March 2010. Their brief combined specific demands with opportunities to contribute to the design and look development of their assets and shots. The way the world of TRON was built and looked was critical to the production and all vendors had a very clearly defined set of rules to follow.

In some cases, the team had plenty of concept art. For others, there was next to none and they needed to flesh out their shots according to the established rules. Because Digital Domain had been working directly with production from the start, Prime Focus’ artists could focus immediately on these tasks instead of first undertaking an extensive R&D and concept stage as they normally would.
Prime Focus’ work centred on a 13-minute sequence, starting at the top of a tower at the End of Line Club, from where the characters step into an elevator that suddenly plummets straight down to the base of the tower to a space called the Sub Level. From here, they take off in the Solar Sailer on their journey across a vast plane. Almost all of these shots required mixing live action with CG, plus some all-CG shots.

Solar Sailer
Their major asset, the Solar Sailer, was based on the Digital Domain Art Department’s concept, which they could revamp, paint over or give more detail to. Similarly, the environments started with initial concepts, but they added more style frames and had the freedom to work on further concepts, exchanging ideas with Eric Barba, the production VFX Supervisor from Digital Domain. Most shots involved live action plates and within the world of TRON, the actors were always shot on blue screen. For example, in the Solar Sailer shots, they would be sitting on a platform on a small or partial set, which the artists would extend or attach to the main part of the ship.

They were given three pieces of concept art for the Solar Sailer, along with a rough model used for previs. What its design needed was what the production called richness – subtle detailing to not only communicate the practical, technical nature of all elements inside TRON but also its large scale, a quality the base asset lacked.

“While there were several practical set pieces for the Solar Sailer, no single, master set piece was made,” said DFX supervisor Jon Cowley. “As a result, portions of the set, for example sections of the cargo pods, were shot live action, which we, in turn, had to match in CG on our platform-only set. Other parts of the vessel, like the sails, only existed as concept art and we had the flexibility and freedom to further develop the look, movement and energy pulses associated with them.” Modelling for the ship was done inside Maya and Mudbox, and at times with ZBrush.

Speed and Distance
Assets were built at a real-world scale, and so the distance and speed at which the Sailer travelled were also real. The primary reason for this was working in stereo – it’s difficult to ‘cheat’ a 3D volume. The total distance it flew through the sequence was approximately 60 miles, flying over a very large piece of land that the team had procedurally generated, on which they could move different hero landscape assets.

The brief stipulated that the ship should move very fast, on an essentially straight line at a constant speed, and at an altitude of 10,000 feet. At this height, of course, objects on the ground scarcely appear to be moving, and they needed a means to create more interest and a sense of motion and speed. They added some flex to the sails, for example, but what was most effective was the quantity of atmospheric elements the artists layered into the surrounding space. In the end, a huge number of these atmospherics was required, and creating and managing them became a major part of their work.

Adding atmospherics meant dealing directly with stereo requirements. The team couldn’t employ some of the 2D techniques typically used for atmospherics, such as putting footage of clouds or smoke on cards. “In a volume of this size, given the amount of depth we had to convey, everything had to be 3D CG and had to be rendered,” said Jon. “It’s amazing what can happen. When something like a cloud is completely out of focus off in the distance, if it isn’t sitting in the proper spot in stereo space, you’ll suddenly have a soft fuzzy cloud popping off screen or sitting too far into the background.”

Cloud Team
They had to create every single cloud and every passing wisp of vapour from the engine or off the edges of the sail. A special team was devoted purely to making these atmospherics, especially clouds and cloudscapes, art-directing their looks. Matt Fairclough, the developer of Terragen – the environmental effects application they were using for clouds – came on board to help write custom code for this part of their work, producing tools to help fill the sky environments.

For the terrestrial landscape, they wrote their own procedural landscape generator that obeyed the TRON rules. For example, the rules about angles dictated that all objects would be constructed with 30°, 60° and 90° angles, like the construction of a computer chip. Their procedural toolset was built accordingly and used to construct most of the environment, into which they would drop ‘hero’ sculpted pieces wherever required. “This tool had to be made quite show-specific for TRON,” said VFX Supervisor Chris Harvey, “but no doubt we’ll recycle the code at some stage for use on another project.”
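The angle rule itself is straightforward to enforce procedurally. As a hypothetical illustration of the approach (Prime Focus’ generator was of course far more elaborate), edge directions can be snapped to the nearest multiple of 30° while segment lengths are preserved:

```python
import math

def snap_angle(theta_deg, step=30.0):
    """Snap a direction to the nearest multiple of `step` degrees,
    enforcing the TRON 30/60/90 construction rule."""
    return (round(theta_deg / step) * step) % 360.0

def snap_polyline(points, step=30.0):
    """Rebuild a polyline so every segment runs at a snapped angle,
    keeping each segment's original length."""
    out = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        deg = math.degrees(math.atan2(y1 - y0, x1 - x0))
        ang = math.radians(snap_angle(deg, step))
        px, py = out[-1]
        out.append((px + length * math.cos(ang), py + length * math.sin(ang)))
    return out
```

Running every generated contour, road and structure outline through a constraint like this would give the whole landscape the computer-chip geometry the rules describe.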

Digital Domain supplied various shot references from other sections of the film plus the concept artwork for the environments. The shots supplied for reference were various hero shots that illustrated the style for the film and at times showed bits of landscape. Apart from that, the team had three rules to follow. First was the direction of travel, a straight line toward the portal, as dictated by the story. Second was the 30°-60°-90°-angle rule. Third was the layout of a V-shaped canyon, composed of 30° angles that the Solar Sailer entered on its way to the portal.

Stereo Cameras
They set up stereo cameras – in Nuke, Maya, Terragen and 3ds Max – all exported from the same matchmoved cameras in SynthEyes using the various camera data and plates received from production. Prime Focus had their own proprietary camera file format used for data interchange, so they used this to move the data back and forth among the various software packages. This way, CG shots created in Maya could be exported to Nuke and Terragen.
“Our reliance on several different software packages, having to bring assets and animation back and forth between applications, and the need for pixel-perfect stereo accuracy, meant that we had to bulletproof our stereo camera pipeline. All of our cameras were generated and versioned out of SynthEyes, which allowed us to export heavy geometry sets from Maya to Terragen for hold-out geometry and so forth,” Jon said.

Skies and Cityscapes
TRON city’s look was quite well established by the production in the concept work. Paul Lambert explained his team’s technique for cityscapes. “We used a paint projection technique for anything close up. We would paint what the scene would look like from an angle from the shot and then project that onto the geometry so that it would work for other angles.”
Prime Focus had a number of shots to create of the city at longer range, as well as several exterior shots of the nightclub tower. Again, because this film was made stereoscopically, the cityscapes couldn’t be approached as conventional matte paintings. They shared assets with Digital Domain and augmented them for their shots as needed. In some cases, Digital Domain’s shots didn’t require certain angles or portions of the city, so they needed to build these from scratch. Areas like TRON’s suburbs also had to be designed and built.

Dealing with the fundamental changes required to produce matte paintings for stereo movies was a challenge. Jon explained that nothing could be cheated or approximated; all geometry had to exist and be created, modelled and assembled to occupy 3D space. They used the paintings for a lot of the very large assets because it was easier than trying to texture and render every detail.
They also produced extremely large matte painting sky domes over the environment the ship travelled through. The sky was extensively art directed to have a particular look over the city and also over the portal. The volumetric clouds were layered up over these deep skies.

Freefall
The freefalling elevator part of the sequence was fast and furious, lasting only moments. It included close internal shots showing the characters scrambling to stop the fall, and cut to wide exterior city shots of the elevator falling down the tower. In itself it wasn’t so challenging but the artists needed to correctly handle the spatial and stereo aspects of showing speed, the exteriors and the clouds rushing by, and put a lot of effort into just a few moments.

Jon said, “Stereo can throw a wrench into a shot that should otherwise be pretty straightforward. For example, in this case, the glass in the elevator needed reflections added to it in post, which normally isn’t so difficult. But these had to be proper reflections, receding in depth in stereo.”

Single Light Kit
For their extended travel sequence the team benefitted from the fairly consistent lighting environment. “Our only really tangible light source was the beam that the ship was riding upon. Lighting without a single, sun-like light source actually made the sequence easier in the long run, once the strategy was correctly set up. Only a single light kit was required that we could use throughout. For the interiors, the ship was self-illuminating and also lit the characters, unlike a normal set where you have to keep track of a key light and fill lights for the action. Once we had the lighting in place, we could stick with it.”

However, getting it right in the first place took some effort – it wasn’t simply a matter of matching their lighting to an existing plate, but had to be consciously art-directed like the clouds. Chris said, “An analogy we used early in production to capture the look was that TRON should have some characteristics of a stylish car advertisement. Our goal became creating that feeling of light emanating from the beam of light they were traveling on, and the impression of light from the direction of the portal and city, paying particular attention to the artistic nature of the shot and yet trying to maintain a believable and consistent lighting direction.”

One of the more difficult lighting developments was establishing the set-up for all clouds and environments. They wanted to ensure that they could tweak and control this lighting further along the pipeline in Nuke. Because it would be affecting so many shots, they especially didn’t want to commit themselves to an approach that couldn’t be adjusted in the composite to suit a special request from the director or VFX supervisor. They wanted to avoid going back to extensively relight shots.

So they broke down their set-up into an RGB-type of pass. Jon explained, “The convenient aspect of this uniform lighting was that, across the wide expanse, we could limit the sources we had to deal with to the light they were travelling toward, the light they were coming from, and the generic fill lights used for our atmospherics and clouds, that were being created by the beam. That gave us three basic light sources, just right for an RGB system.”
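The idea can be sketched like this. This is a hypothetical recombination step, not the production tool: each channel of the packed pass holds one light source’s grayscale contribution, and the comp tints and scales each channel independently, so the lighting can be rebalanced without a re-render.

```python
import numpy as np

def recombine(rgb_pass, colours, gains):
    """Rebuild the lighting from an RGB-packed light pass.
    rgb_pass[..., ch] holds one source's grayscale contribution;
    colours[ch] and gains[ch] tint and scale it in the composite."""
    out = np.zeros(rgb_pass.shape[:-1] + (3,))
    for ch in range(3):
        out += gains[ch] * rgb_pass[..., ch:ch + 1] * np.asarray(colours[ch], dtype=float)
    return out

# e.g. R = portal light, G = city light, B = beam fill (names assumed
# for illustration) -- boosting the portal is just a gain change,
# with no trip back to the renderer.
```

Packing three sources into one render pass this way trades a little compositing bookkeeping for exactly the late-stage flexibility the team wanted.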

Render Economy
Completing those 13 minutes of the film actually meant generating a total of 26 minutes of left and right eye footage, and took a massive amount of rendering. Render passes varied dramatically in number. “A simple shot with one or two actors might have, for example, nine cloud passes, the sky background as part of the Nuke build, and three different landscape passes, all repeated for the second eye,” said Chris. “Conversely, one of our biggest shots was a 180° camera move – first looking at the entire cityscape and then looking out at a vast landscape, becoming a massive shot with hundreds of layers including our hero Solar Sailer asset flying right by the camera. This entailed many reflective surfaces such as glass pieces on the cargo pods. One frame needed 30 hours of render time.”
Atmospherics were never ‘cheap’ to render either. Stereo frequently needs much more detail to fill spaces. Artists filling in crowd scenes often have this experience. Shooting with a long lens on flat film lets you pack detail into spaces fairly quickly, but being able to see and feel 3-dimensional volume requires much more data.

Checkerboard Render
Although facilities have different approaches to reducing render time, Prime Focus would complete and get approval on left-eye footage first and then render the right eye afterwards. To be sure that they were handling the stereo correctly from the start, they had a full layout department who would track the live-action or all-CG shots and render them, left and right eyes, with a simple checkerboard environment. These were sent to Digital Domain to sign off specifically on the stereo and composition.

From that stage, they would work on the left eye and fully render it in CG, and then the right eye, also fully rendered in CG. Jon watched the render process carefully. “Within the 13 minutes of material, a particular render might not come off the farm for three or four days,” Jon said. “It may look great but, for some reason, have required 20 hours per frame. The right eye now needs rendering, but that render time is not really acceptable within your workflow. So you have a dilemma – do you stop and figure out where the settings could be improved and render the left eye again or, because it looks good, just retain the settings and render the right eye the same way? You can’t change the settings for the right eye only. They must be identical.”

Data management to keep track of the corresponding passes and layers for the left and right eye of each shot was another substantial undertaking. They wrote special tools to manage these details, freeing up the lighters and compositors so they could concentrate on getting the left eye to look as good as possible. Only a small dedicated team – consisting basically of Jon! – had to worry about the right eye. These tools, or interfaces, allowed him to control the process without having to open Nuke or Maya or revisit the scenes to track what was generated or being used in the composites, and match the right eye to this.

New Tools

Digital Domain chose V-Ray as the renderer that all facilities on TRON would use, mainly for consistency and tool sharing. Fortunately, it was already the renderer Prime Focus used in its pipeline and had used on many previous projects.

Another mandated software was Nuke. In this case, Prime Focus had not been using Nuke in their pipeline. They had been a Digital Fusion studio, and found they had to change their entire 3D pipeline for TRON, a considerable integration process that involved porting several tools over to work with it. However, Nuke already had many good stereo tools that made some of their in-house tools redundant, reducing the porting burden.

Because Digital Domain supplied the different Nuke-specific ‘show-look’ tools, like the glow lines tool mentioned above, as well as stereo tools, everyone was working on common ground. Prime Focus especially liked OCULA’s tools for dealing with stereo anomalies. “A high-contrast show like TRON could have become even more demanding for us, especially regarding light polarization and reflective surfaces that potentially create stereo issues,” said Jon.

Life in Stereo
Chris had completed a number of stereo film projects before. ‘Journey to the Centre of the Earth’ was his first, when techniques and tools were still quite new and proprietary, and became a huge stereo learning and discovery process requiring trial and error. There was almost no one to consult at that time. His skills were further refined on Avatar, when he did some consulting, and on Prime Focus’ conversion work. Despite his experience with stereo he resists thinking of himself as an expert because he feels that stereo in the industry itself is still in its infancy, and that we all have a lot more to explore.

On the other hand, this was Jon’s first stereo show. “On my first day on the project, I sat in the theatre with Chris looking at shots while he pointed out all the details we would need to address – but I couldn’t see any of them. It was a visual blur to me. So I’d say that the first hurdle on stereo projects is helping the team to become more ‘stereo savvy’ and able to identify issues within the images. You can’t do this alone or by reading books on the subject. You need a mentor to walk you through visually.

“The difficulty of learning to work in stereo can be underestimated – it does change the work of a VFX artist. Stereo also needs an experienced DP on set to shoot the blue screen and other VFX shots for the team to work on. Some aspects of stereo footage can be corrected in post, but others cannot be.”

ColourMatcher R&D
A major stereo issue in post production arose because one camera on a mirror rig shoots through a mirror and the other shoots a reflection from a mirror. “Reflected light is naturally polarized, so the effect creates differences between the left and right images,” Paul explained. “When you have a scene with light sheens and reflections, these will be noticeably less evident in one eye than the other and when viewed together in stereo, the result is disturbing and unnatural to the viewer.

“Polarization issues emerged in every shot containing on-set reflections. Unfortunately, not only is the world of TRON a very shiny place, but it also features characters wearing shiny black suits. The ColourMatcher plug-in for Nuke from The Foundry’s OCULA package enabled us to take the colour information from one eye, analyse the same information in the other eye and correct the pair by making the colours consistent across both images.

“We worked with The Foundry over a few months to help refine ColourMatcher, and it worked out well for TRON. It’s not a ‘plug-and-play’ kind of tool but we were able to successfully hand the set-up that we had developed from the original software to the outsourced teams on the project. When they ran it, it would produce four different outputs, each tuned for different tasks – one for details, one for adjusting focus, one for large gradients and so on, so there wasn’t a single fix that addressed everything. They would generate these outputs and make a collage of the best parts of each, creating the other eye and overcoming the polarisation issues. Once it was running it was fairly straightforward.”
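The underlying idea – transferring one eye’s colour statistics onto the other – can be sketched with a per-channel mean and standard deviation transfer. This is a crude hypothetical stand-in, nothing like ColourMatcher’s actual analysis, but it shows the direction of the correction:

```python
import numpy as np

def match_colour(src, ref):
    """Transfer per-channel mean and standard deviation from `ref`
    (the clean eye) onto `src` (the polarization-affected eye).
    A toy stand-in for a real stereo colour-matching tool."""
    out = np.empty_like(src, dtype=float)
    for ch in range(src.shape[-1]):
        s = src[..., ch].astype(float)
        r = ref[..., ch].astype(float)
        std = s.std() or 1.0  # guard against a flat channel
        out[..., ch] = (s - s.mean()) / std * r.std() + r.mean()
    return out
```

A global transfer like this cannot recover lost sheens or edge detail, which is exactly why the production tool produced several specialised outputs that artists collaged together.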

For v2.1 of OCULA, The Foundry incorporated their set-up and consequently the plug-in now consists of only a few nodes instead of hundreds, making it quicker and easier to get a correction. Paul still feels it’s necessary for users to go through the process of generating the various outputs and collaging them together themselves, instead of going for a one-click procedure.
“In working out these colour issues The Foundry really pulled through for us,” said Paul. “Their earlier version of ColourMatcher didn’t work quite as well, missing some of the fine detail and not handling edges so well, but by working with them and sending them TRON footage so that they would understand the problems, they have made a better plug-in that we could customise for TRON.” Polarization is something that every show using a mirror rig will experience.

Words: Adriene Hurst
Images: Courtesy of Walt Disney Pictures
Featured in Digital Media World.