TRON Rules
Prime Focus was among the last officially awarded vendors on TRON and started work in February and March 2010. Their brief combined specific demands with opportunities to contribute to the design and look development of their assets and shots. The way the world of TRON was built and looked was critical to the production and all vendors had a very clearly defined set of rules to follow.

In some cases, the team had plenty of concept art. For others, there was next to none and they needed to flesh out their shots according to the established rules. Because Digital Domain had been working directly with production from the start, Prime Focus' artists could focus immediately on these tasks instead of first undertaking the extensive R&D and concept stage they normally would.
Prime Focus’ work centred on a 13-minute sequence, starting at the top of a tower at the End of Line Club, from where the characters step into an elevator that suddenly plummets straight down to the base of the tower to a space called the Sub Level. From here, they take off in the Solar Sailer on their journey across a vast plane, for which almost all shots required mixing live action with CG, plus some all-CG shots.

Solar Sailer
Their major asset, the Solar Sailer, was based on the Digital Domain Art Department’s concept, which they could revamp, paint over or give more detail to. Similarly, the environments started with initial concepts, but they added more style frames and had the freedom to work on further concepts, exchanging ideas with Eric Barba, the production VFX Supervisor from Digital Domain. Most shots involved live action plates and within the world of TRON, the actors were always shot on blue screen. For example, in the Solar Sailer shots, they would be sitting on a platform on a small or partial set, which the artists would extend or attach to the main part of the ship.

They were given three pieces of concept art for the Solar Sailer, along with a rough model used for previs. What its design needed was what the production called richness – subtle detailing to not only communicate the practical, technical nature of all elements inside TRON but also its large scale, a quality the base asset lacked.

“While there were several practical set pieces for the Solar Sailer, no single, master set piece was made,” said DFX supervisor Jon Cowley. “As a result, portions of the set, for example sections of the cargo pods, were shot live action and we, in turn, had to match them in CG on our platform-only set. Other parts of the vessel like the sails only existed as concept art, and we had the flexibility and freedom to further develop the look, movement and energy pulses associated with them.” Modelling for the ship was done inside Maya and Mudbox, and at times with ZBrush.

Speed and Distance
Assets were built at real-world scale, and so the distance and speed at which the Sailer travelled were also real. The primary reason for this was working in stereo – it’s difficult to ‘cheat’ a 3D volume. The total distance it flew through the sequence was approximately 60 miles, flying over a very large piece of land that the team had procedurally generated, on which they could move different hero landscape assets.

The brief stipulated that the ship should move very fast, on an essentially straight line at a constant speed, and at an altitude of 10,000 feet. At this height, of course, objects on the ground scarcely appear to be moving, and they needed a means to create more interest and a sense of motion and speed. They added some flex to the sails, for example, but most effective was the quantity of atmospheric elements the artists layered into the surrounding space. In the end, a huge number of these atmospherics was required, and creating and managing them became a major part of their work.

Adding atmospherics meant dealing directly with stereo requirements. The team couldn’t employ some of the 2D techniques typically used for atmospherics, such as putting footage of clouds or smoke on cards. “In a volume of this size, given the amount of depth we had to convey, everything had to be 3D CG and had to be rendered,” said Jon. “It’s amazing what can happen. When something like a cloud is completely out of focus off in the distance, if it isn’t sitting in the proper spot in stereo space, you’ll suddenly have a soft fuzzy cloud popping off screen or sitting too far into the background.”
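Why an out-of-place cloud ‘pops off screen’ comes down to parallax: a point’s left/right offset on screen depends on its depth relative to the convergence plane. A minimal sketch of the standard relation (illustrative only, not production code from the show):

```python
def screen_parallax(interaxial, convergence, depth):
    """Approximate screen-space parallax for a point at `depth`, shot with
    cameras separated by `interaxial` and converged at `convergence`
    (all in the same units). Zero parallax puts the point on the screen
    plane; positive pushes it behind the screen, negative in front.
    A misjudged depth therefore shifts the element forward or back in
    the stereo volume, even if it looks fine in one eye."""
    return interaxial * (1.0 - convergence / depth)
```

For example, with a 65mm interaxial converged at 10m, a cloud placed at 5m instead of 20m flips its parallax from positive to negative, pulling it in front of the screen.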

Cloud Team
They had to create every single cloud and every passing wisp of vapour from the engine or off the edges of the sail. A special team was devoted purely to making these atmospherics, especially clouds and cloudscapes, art-directing their looks. Matt Fairclough, developer of Terragen, the environmental effects application they were using for clouds, came on board to help write custom code for this part of their work, producing tools to help fill the sky environments.

For the terrestrial landscape, they wrote their own procedural landscape generator that obeyed the TRON rules. For example, the rules about angles dictated that all objects would be constructed with 30°, 60° and 90° angles, like the construction of a computer chip. Their procedural toolset was built accordingly and used to construct most of the environment, into which they would drop ‘hero’ sculpted pieces wherever required. “This tool had to be made quite show-specific for TRON,” said VFX Supervisor Chris Harvey, “but no doubt we’ll recycle the code at some stage for use on another project.”
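One way such an angle rule can be enforced procedurally – a minimal sketch only, not Prime Focus’ actual generator – is to snap every randomly chosen edge direction to the nearest permitted angle before building geometry:

```python
import math
import random

# The TRON construction rule: edges may only run at these angles
# (measured within each 90-degree quadrant).
ALLOWED_ANGLES = [0, 30, 60, 90]

def snap_angle(angle_deg):
    """Snap an arbitrary direction to the nearest allowed TRON angle,
    preserving which 90-degree quadrant it fell in."""
    base = angle_deg % 90
    nearest = min(ALLOWED_ANGLES, key=lambda a: abs(a - base))
    return (angle_deg - base) + nearest

def generate_outline(n_edges, seed=0):
    """Walk a random 2D polyline in which every segment obeys the rule,
    the kind of primitive a procedural layout tool could extrude into
    chip-like landscape blocks."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    pts = [(x, y)]
    for _ in range(n_edges):
        a = math.radians(snap_angle(rng.uniform(0.0, 360.0)))
        length = rng.uniform(1.0, 5.0)
        x, y = x + length * math.cos(a), y + length * math.sin(a)
        pts.append((x, y))
    return pts
```

Hero sculpted pieces would then be placed by hand into terrain built from constrained primitives like these.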

Digital Domain supplied various shot references from other sections of the film plus the concept artwork for the environments. The shots supplied for reference were various hero shots that illustrated the style for the film and at times showed bits of landscape. Apart from that, the team had three rules to follow. First was the direction of travel, a straight line toward the portal, as dictated by the story. Second was the 30°-60°-90°-angle rule. Third was the layout of a V-shaped canyon, composed of 30° angles that the Solar Sailer entered on its way to the portal.

Stereo Cameras
They set up stereo cameras – in Nuke, Maya, Terragen and 3ds Max – all exported from the same matchmoved cameras in SynthEyes, using the various camera data and plates received from production. Prime Focus had their own proprietary camera file format for data interchange, which they used to move the data back and forth among the various software packages. This way, CG shots created in Maya could be exported to Nuke and Terragen.
“Our reliance on several different software packages, having to bring assets and animation back and forth between applications, and the need for pixel-perfect stereo accuracy meant that we had to bulletproof our stereo camera pipeline. All of our cameras were generated and versioned out of SynthEyes, which also allowed us to export heavy geometry sets from Maya to Terragen for hold-out geometry and so forth,” Jon said.

Skies and Cityscapes
TRON city’s look was quite well established by the production in the concept work. Paul Lambert explained his team’s technique for cityscapes. “We used a paint projection technique for anything close up. We would paint what the scene would look like from an angle from the shot and then project that onto the geometry so that it would work for other angles.”
Prime Focus had a number of shots to create of the city at longer range, as well as several exterior shots of the nightclub tower. Again, because this film was made stereoscopically, the cityscapes couldn’t be approached as conventional matte paintings. They shared assets with Digital Domain and augmented them for their shots as needed. In some cases, Digital Domain's shots hadn't required certain angles or portions of the city, so Prime Focus needed to build these from scratch. Areas like TRON's suburbs also had to be designed and built.

Dealing with the fundamental changes required to produce matte paintings for stereo movies was a challenge. Jon explained that nothing can be cheated or approximated: all geometry had to exist and be created, modelled and assembled to occupy 3D space. They used the paintings for a lot of the very large assets because it was easier than trying to texture and render every detail.
They also produced extremely large matte painting sky domes over the environment the ship travelled through. The sky was extensively art directed to have a particular look over the city and also over the portal. The volumetric clouds were layered up over these deep skies.

Freefall
The freefalling elevator part of the sequence was fast and furious, lasting only moments. It included close internal shots showing the characters scrambling to stop the fall, and cut to wide exterior city shots of the elevator falling down the tower. In itself it wasn’t so challenging but the artists needed to correctly handle the spatial and stereo aspects of showing speed, the exteriors and the clouds rushing by, and put a lot of effort into just a few moments.

Jon said, “Stereo can throw a wrench into a shot that should otherwise be pretty straightforward. For example, in this case, the glass in the elevator needed reflections added to it in post, which normally isn’t so difficult. But these had to be proper reflections, receding in depth in stereo.”

Single Light Kit
For their extended travel sequence the team benefitted from the fairly consistent lighting environment. “Our only really tangible light source was the beam that the ship was riding upon. Lighting without a single, sun-like light source actually made the sequence easier in the long run, once the strategy was correctly set up. Only a single light kit was required that we could use throughout. For the interiors, the ship was self-illuminating and also lit the characters, unlike a normal set where you have to keep track of a key light and fill lights for the action. Once we had the lighting in place, we could stick with it.”

However, getting it right in the first place took some effort – it wasn’t simply a matter of matching their lighting to an existing plate, but had to be consciously art-directed like the clouds. Chris said, “An analogy we used early in production to capture the look was that TRON should have some characteristics of a stylish car advertisement. Our goal became creating that feeling of light emanating from the beam of light they were traveling on, and the impression of light from the direction of the portal and city, paying particular attention to the artistic nature of the shot and yet trying to maintain a believable and consistent lighting direction.”

One of the more difficult lighting developments was establishing the set-up for all clouds and environments. They wanted to ensure that they could tweak and control this lighting further along the pipeline in Nuke. Because it would be affecting so many shots, they especially didn’t want to commit themselves to an approach that couldn’t be adjusted in the composite to suit a special request from the director or VFX supervisor. They wanted to avoid going back to extensively relight shots.

So they broke down their set-up into an RGB-type pass. Jon explained, “The convenient aspect of this uniform lighting was that, across the wide expanse, we could limit the sources we had to deal with to the light they were travelling toward, the light they were coming from, and the generic fill lights for our atmospherics and clouds that were being created by the beam. That gave us three basic light sources, just right for an RGB system.”
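The usual mechanics of an RGB light pass are to render each light’s contribution into one colour channel of a single image, then tint and sum the channels in the composite, so the lighting balance stays adjustable without re-rendering. A simplified sketch under assumed channel assignments (not the studio’s actual setup):

```python
import numpy as np

def comp_rgb_light_pass(pass_rgb, tint_r, tint_g, tint_b):
    """Recombine an RGB light pass in comp.
    pass_rgb: (H, W, 3) float array in which, illustratively,
    R = light from the portal direction, G = light from the city
    direction, B = beam fill for atmospherics and clouds.
    Each tint is an (r, g, b) colour the compositor can grade
    per shot to satisfy a late request from the director."""
    r = pass_rgb[..., 0:1] * np.asarray(tint_r, dtype=float)
    g = pass_rgb[..., 1:2] * np.asarray(tint_g, dtype=float)
    b = pass_rgb[..., 2:3] * np.asarray(tint_b, dtype=float)
    return r + g + b  # graded beauty, no relight required
```

The design point is exactly the one Jon describes: with only three effective light sources, the whole lighting rig fits into one three-channel render.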

Render Economy
Completing those 13 minutes of the film actually meant generating a total of 26 minutes of left- and right-eye footage, and took a massive amount of rendering. Render passes varied dramatically in number. “A simple shot with one or two actors might have, for example, nine cloud passes, the sky background as part of the Nuke build, and three different landscape passes, all repeated for the second eye,” said Chris. “Conversely, one of our biggest shots was a 180° camera move – first looking at the entire cityscape and then out at a vast landscape, becoming a massive shot with hundreds of layers including our hero Solar Sailer asset flying right by the camera. This entailed many reflective surfaces such as glass pieces on the cargo pods. One frame needed 30 hours of render time.”
Atmospherics were never ‘cheap’ to render either. Stereo frequently needs much more detail to fill spaces. Artists filling in crowd scenes often have this experience. Shooting with a long lens on flat film lets you pack detail into spaces fairly quickly, but being able to see and feel 3-dimensional volume requires much more data.

Checkerboard Render
Although facilities have different approaches to reducing render time, Prime Focus would complete and get approval on left-eye footage first and then render the right eye afterwards. To be sure that they were handling the stereo correctly from the start, they had a full layout department who would track the live-action or all-CG shots and render them, left and right eyes, with a simple checkerboard environment. These were sent to Digital Domain to sign off specifically on the stereo and composition.

From that stage, they would work on the left eye and fully render it in CG, and then the right eye, also fully rendered in CG. Jon watched the render process carefully. “Within the 13 minutes of material, a particular render might not come off the farm for three or four days,” he said. “It may look great but, for some reason, have required 20 hours per frame. The right eye now needs rendering, but that render time is not really acceptable within your workflow. So you have a dilemma – do you stop and figure out where the settings could be improved and render the left eye again or, because it looks good, just retain the settings and render the right eye the same way? You can’t change the settings for the right eye only. They must be identical.”

Data management to keep track of the corresponding passes and layers for the left and right eye of each shot was another substantial undertaking. They wrote special tools to manage these details, freeing up the lighters and compositors to concentrate on getting the left eye to look as good as possible. Only a small dedicated team – consisting basically of Jon! – had to worry about the right eye. These tools, or interfaces, allowed him to control the process without having to open Nuke or Maya or revisit the scenes to track what was generated or being used in the composites, and match the right eye to this.
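The core of such a tool is bookkeeping: for every left-eye pass actually used in the composite, derive the expected right-eye equivalent and flag what still needs rendering. A toy sketch with an invented `_L`/`_R` naming convention (the article does not describe Prime Focus’ actual scheme or tools):

```python
import re

def right_eye_passes(left_paths, existing_right):
    """Given the left-eye pass file paths in use, derive the expected
    right-eye paths and split them into already-rendered versus still
    to-do. The '_L'/'_R' token before the frame extension is purely
    illustrative."""
    done, todo = [], []
    for path in left_paths:
        right = re.sub(r'_L(?=\.)', '_R', path)
        (done if right in existing_right else todo).append(right)
    return done, todo
```

A report built from `todo` is enough for one person to drive right-eye renders for a whole sequence without opening the scene files.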

New Tools

Digital Domain chose V-Ray as the renderer that all facilities on TRON would use, mainly for consistency and tool sharing. Fortunately, it was already the renderer Prime Focus used in its pipeline and had used on many previous projects.

Another mandated software package was Nuke. Prime Focus had not been using Nuke in their pipeline – they had been a Digital Fusion studio – and found they had to change their entire 3D pipeline for TRON, involving a considerable integration process and porting several tools over to work with it. However, Nuke already had many good stereo tools that made some of their in-house tools redundant, reducing the amount of porting required.

Because Digital Domain supplied the different Nuke-specific ‘show-look’ tools, like the glow lines tool mentioned above, as well as stereo tools, everyone was working on common ground. Prime Focus especially liked OCULA’s tools for dealing with stereo anomalies. “A high-contrast show like TRON could have become even more demanding for us, especially regarding light polarization and reflective surfaces that potentially create stereo issues,” said Jon.

Life in Stereo
Chris had completed a number of stereo film projects before. ‘Journey to the Centre of the Earth’ was his first, when techniques and tools were still quite new and proprietary, and became a huge stereo learning and discovery process requiring trial and error. There was almost no one to consult at that time. His skills were further refined on Avatar, when he did some consulting, and on Prime Focus’ conversion work. Despite his experience with stereo he resists thinking of himself as an expert because he feels that stereo in the industry itself is still in its infancy, and that we all have a lot more to explore.

On the other hand, this was Jon’s first stereo show. “On my first day on the project, I sat in the theatre with Chris looking at shots while he pointed out all the details we would need to address – but I couldn’t see any of them. It was a visual blur to me. So I’d say that the first hurdle on stereo projects is helping the team to become more ‘stereo savvy’ and able to identify issues within the images. You can’t do this alone or by reading books on the subject. You need a mentor to walk you through visually.

“The difficulty of learning to work in stereo can be underestimated – it does change the work of a VFX artist. Stereo also needs an experienced DP on set to shoot the blue screen and other VFX shots for the team to work on. Some aspects of stereo footage can be corrected in post, but others cannot be.”

ColourMatcher R&D
A major stereo issue in the post production arose because one camera on a mirror rig shoots through a mirror and the other shoots a reflection from a mirror. “Reflected light is naturally polarized, so the effect creates differences between the left and right images,” Paul explained. “When you have a scene with light sheens and reflections, these will be noticeably less evident in one eye than the other and when viewed together in stereo, the result is disturbing and unnatural to the viewer.

“Polarization issues emerged in every shot containing on-set reflections. Unfortunately, not only is the world of TRON a very shiny place, but it also features characters wearing shiny black suits. The ColourMatcher plug-in for Nuke from The Foundry’s OCULA package enabled us to take the colour information from one eye, analyse the same information in the other eye and correct the pair by making the colours consistent across both images.

“We worked with The Foundry over a few months to help refine ColourMatcher, and it worked out well for TRON. It’s not a ‘plug-and-play’ kind of tool but we were able to successfully hand the set-up that we had developed from the original software to the outsourced teams on the project. When they ran it, it would produce four different outputs, each tuned for different tasks – one for details, one for adjusting focus, one for large gradients and so on, so there wasn’t a single fix that addressed everything. They would generate these outputs and make a collage of the best parts of each, creating the other eye and overcoming the polarisation issues. Once it was running it was fairly straightforward.”
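The underlying problem – making one eye’s colours statistically consistent with the other’s – can be illustrated with a crude per-channel transfer. This is a toy sketch only; the real ColourMatcher is far more sophisticated, working locally and preserving fine detail and edges:

```python
import numpy as np

def match_colour(source, reference, eps=1e-6):
    """Naive global colour transfer between stereo eyes: shift and
    scale each channel of the source eye so its mean and deviation
    match the reference eye. Enough to show why polarization-induced
    sheen differences can be pulled back toward consistency, but a
    global fit like this cannot recover local detail differences."""
    out = np.empty_like(source, dtype=np.float64)
    for c in range(source.shape[-1]):
        s, r = source[..., c], reference[..., c]
        scale = r.std() / (s.std() + eps)
        out[..., c] = (s - s.mean()) * scale + r.mean()
    return out
```

The gap between this one-line statistic and a usable result is exactly why the production set-up generated several differently tuned outputs – details, focus, large gradients – and collaged the best parts of each.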

For v2.1 of OCULA, The Foundry incorporated their set-up, and the plug-in now consists of only a few nodes instead of hundreds, making it quicker and easier to get a correction. Paul still feels it’s necessary for users to go through the process of generating the various outputs and collaging them together themselves, instead of going for a one-click procedure.
“In working out these colour issues The Foundry really pulled through for us,” said Paul. “Their earlier version of ColourMatcher didn’t work quite as well, missing some of the fine detail and not handling edges so well, but by working with them and sending them TRON footage so that they would understand the problems, they have made a better plug-in that we could customise for TRON.” Polarization is something that every show using a mirror rig will experience.

Words: Adriene Hurst
Images: Courtesy of Walt Disney Pictures