
James Cameron’s ‘Avatar’ has sparked new ways of thinking about software, hardware and workflow for movie making. Despite the look on screen, the movie wasn’t made with magic. Artists pushed the tools in their pipelines to the limit in new ways and combinations, to carry out the performance capture, animation and photorealistic rendering that make ‘Avatar’ a star.

Visions of Pandora
Getting started on a project like ‘Avatar’ was daunting. Building James Cameron’s new world began with a visually descriptive tool. The art department and Production Designers Rick Carter and Robert Stromberg used Adobe Photoshop to produce artwork for concept reels shown to the studio heads at 20th Century Fox to convince them the project was viable. After Effects was used to place flowing camera moves over the still artwork and to dissolve between views, so the reels didn’t read as a progression of static stills.
The artists also used Photoshop for storyboarding and to create the very high resolution matte paintings and textures that were passed, as finished artwork, to the 3D pipeline for CG environments, vehicles and creatures. Thousands of images were taken on-set as lighting and texture references, and artists in the previs department used Lightroom to organize and catalogue them into databases.

Facial Performances

After Effects played other roles. Mike Kanfer, Adobe Business Development Manager, worked with the ‘Avatar’ teams at Giant Studios in Los Angeles, where the virtual stage was built. He explained that an important goal during the shoot was letting James see a motion-captured actor performing live as his CG counterpart in a virtual environment. Equally important was monitoring the way the live-action actors were interacting with CG characters and elements as they were shot on green screen.
Using Simulcam, James would shoot a take that could be reviewed immediately via a quick composite in After Effects, viewing it with the CG characters in place, including a rough version of their facial performances. After Effects wiped in footage of their eyes, noses and mouths recorded during the shoot on head-mounted camera rigs. Despite their crude appearance, these composites held enough information to tell whether his direction and the actors’ performances had been successful, and James could decide whether to approve or re-shoot the take.
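The principle behind those quick facial composites can be sketched in a few lines of code. The example below, written in Python with the Pillow imaging library, simply pastes a soft-edged crop of head-cam footage over the CG character in a rendered frame; the file names, face position and blur radius are invented for illustration, since the real work happened inside After Effects rather than in a standalone script.

```python
# Minimal sketch of the facial 'wipe' idea: paste a head-cam crop of an actor's
# eyes, nose and mouth over the corresponding CG character in a rough composite.
# File names, coordinates and blur radius are hypothetical.
from PIL import Image, ImageDraw, ImageFilter

render = Image.open("take_0042_cg_character.png").convert("RGBA")   # rough CG comp frame
headcam = Image.open("take_0042_headcam_face.png").convert("RGBA")  # head-rig footage frame

# Approximate screen position of the character's face in this take (hypothetical).
face_box = (812, 240, 1012, 440)
face_crop = headcam.resize((face_box[2] - face_box[0], face_box[3] - face_box[1]))

# Soft-edged mask so the wipe blends into the CG face instead of hard-cutting.
mask = Image.new("L", face_crop.size, 0)
draw = ImageDraw.Draw(mask)
draw.rectangle((16, 16, face_crop.size[0] - 16, face_crop.size[1] - 16), fill=255)
mask = mask.filter(ImageFilter.GaussianBlur(8))

render.paste(face_crop, face_box[:2], mask)
render.save("take_0042_review.png")
```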
David Stripinis was lead virtual camera operator on ‘Avatar’ at Giant Studios in Los Angeles. He described the virtual camera, a proprietary system, as working like an LCD monitor fitted with motion capture markers - Cameron’s means of capturing the performances of actors in motion-capture gear on the virtual stage as if on a live-action set.

Virtual Crew
Stripinis and the other virtual camera operators applied motion capture signals from the set to the characters, rigged and loaded into MotionBuilder, and then handled any issues as they arose. In order to combine all elements of the shot in real time, the team supplied each scene with digital equivalents of the services available in a traditional movie-set environment, from greensman, propmaster and grip to makeup, costumes and of course DP - as well as driving the camera through the virtual set. They also sometimes needed to manipulate and even create models, for which Stripinis frequently used Luxology modo and Adobe Photoshop.
The virtual camera they were operating could move with super-real agility within the set. Stripinis noted that a major advantage for performance was that the actors could always address and look at each other directly, because the camera could, for example, be moved into their line of sight later. Actors could also deal with and respond to stunt people, kitted out with props as stand-ins for CG beasts and monsters, however they needed to for the best performance.

Monitors and HUDs
Some vendors hired for VFX work used After Effects to create 3D stereo composites for finished shots, motion graphics for the 3D holographic screens in various control room scenes and heads-up displays for the vehicles in the film. Prime Focus designed displays for the Op Center's Holotable over which the film's main characters discuss their mining plans on Pandora. In one scene, Jake Sully and an officer invoke a three-dimensional hologram of the ‘Home Tree’, where the Na’vi people live.
Using the original live-action plate of a table with a greenscreen across the top, Prime Focus modeled the hardware that went inside the table and the projector beams, and added graphics projected above the table of the terrain, including the Home Tree. These graphics were designed in 2D in Illustrator, animated in After Effects, placed on cards in 3D and rendered in 3ds Max. Prime Focus Software's Krakatoa particle system was used for the 3D terrain, which gave the images a scan-lined LIDAR-like quality. The team developed a custom graphics script called 'Screen Art Graphic Interface', binding After Effects renders to a 3ds Max assembly file.
Pixel Liberation Front designed and animated the monitors and HUDs for the movie’s military aircraft. Their team used recent military aircraft - the F-22 and F-35 Joint Strike Fighters and the Apache attack helicopter - as a reference starting point, and embellished the designs to fit the needs and dimensions of the aircraft in the film.
Stephen Lawes, creative director at PLF, said, "Each aircraft had a specific design approach that uniquely identified that specific vehicle. The Valkyrie and Dragon were the most complicated and dense due to the curvature, size and quantity of the screens. The AMP suit HUD especially required a lot of attention as it drove a series of story points depicting the distance and size of the Direhorse and Hammerhead charges as an active infrared visual.” They designed the initial screens in Adobe Illustrator for all of these aircraft, and animated them in Adobe After Effects.
For the rough composites, both Weta and ILM sent PLF tracking data, which PLF used in AE to get approval from Cameron before delivering the final elements as .EXR files to both vendors. Most of the graphics didn't require their own stereo space, so mono elements were delivered. For the few close-ups that did require stereo treatment, the team spatially arranged the graphics in AE's 3D space with two virtual cameras derived from the 3D track of the plates.

Stereo Depth
PLF also delivered shots in the shack set and some set-extension composites. Stephen noted that 3D stereo tracks have to be extremely accurate, so they made sure to verify the accuracy of the depth data they were passing from PFTrack to After Effects and Nuke. Lawes described their stereo workflow. “We wrote proprietary tools for AE that made stereo compositing more efficient. The artist concentrated on comping or animating for one eye and then offsets would be automatically generated for the other eye with the correct depth.”
Using these tools, they constructed the military base hallway extensions from a collection of stills taken on-set and patched together with Photoshop’s Vanishing Point function. The completed extension was exported as a .vpe file into After Effects, which gave them a full 3D version. Once the plates were tracked in PFTrack and exported to AE, they could place the CG hallway at the correct stereo depth, and carry out keying and colour correction to finish the shot. But they found that even simple 2D shots became extremely complicated in stereo 3D. In all, they delivered about 220 shots.
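The offsetting idea behind those proprietary AE tools can be illustrated with a simple parallel-camera model: the second eye's horizontal shift for a layer falls out of its depth, the interaxial separation and the convergence distance. The Python sketch below uses placeholder numbers and a generic formula, not PLF's actual tools or production values.

```python
# Simplified sketch of the 'offset the other eye from depth' idea, assuming a
# parallel stereo camera pair. Interaxial, focal length and convergence distance
# are placeholder numbers, not values from the production.
def second_eye_offset_px(depth, focal_px=1800.0, interaxial=0.065, convergence=4.0):
    """Horizontal parallax (in pixels) to shift a layer for the second eye.

    depth        -- distance of the layer from camera, in scene units (metres here)
    focal_px     -- camera focal length expressed in pixels
    interaxial   -- separation between the two camera 'eyes', in scene units
    convergence  -- depth at which parallax is zero (the screen plane)
    """
    return focal_px * interaxial * (1.0 / convergence - 1.0 / depth)

# A layer comped at 2 m sits in front of the screen plane, one at 20 m behind it.
for z in (2.0, 4.0, 20.0):
    print(f"depth {z:5.1f} m -> offset {second_eye_offset_px(z):+7.2f} px")
```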

Editing Shortcut
Most After Effects processes can be automated through procedural scripting, Mike Kanfer said. The ‘Avatar’ artists wrote scripts to automate the rendering pipeline for comping the previs shots, and especially for the facial ‘mask’ technique, to rapidly prepare hundreds of iterations for the animation department. On set, Premiere Pro supported After Effects by playing rough composites back in context with animated sequences, and by setting up A/B comparisons of VFX vendor work for review.
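Those pipeline scripts were written against After Effects' own scripting interface, but the batch-rendering idea translates to any language. As a rough illustration, the Python sketch below drives After Effects' aerender command-line renderer over a list of comps; the project path, comp names and output pattern are hypothetical.

```python
# Rough illustration of batch-rendering many comp iterations, assuming After
# Effects' aerender command-line renderer is on the PATH. Project path, comp
# names and output pattern are invented for the example.
import subprocess
from pathlib import Path

PROJECT = Path("previs_facial_masks.aep")
OUT_DIR = Path("renders")
OUT_DIR.mkdir(exist_ok=True)

comps = [f"shot_{n:04d}_mask" for n in range(1, 11)]  # hypothetical comp names

for comp in comps:
    out = OUT_DIR / f"{comp}_[####].tif"
    subprocess.run(
        ["aerender", "-project", str(PROJECT), "-comp", comp, "-output", str(out)],
        check=True,
    )
    print(f"rendered {comp}")
```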
While the editing for ‘Avatar’ was done on an Avid system, Premiere Pro can read Avid edit decision lists and other metadata through its AAF import feature. This saved time when updating sequences for the animation department: instead of rendering out updated sequences, the Avid editor could export the new cut’s edit data to a small file, which Premiere Pro then used to assemble a matching cut automatically. Digital video files of the shots held online in the master shot database were sourced as clips by Premiere Pro. This shortcut was used especially toward the final stages of post-production.
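To show the kind of information such a small exchange file carries, here is a minimal parser for a CMX3600-style EDL in Python. The sample events are invented, and ‘Avatar’ actually moved this data as Avid AAF files, which hold richer metadata than a plain EDL.

```python
# Minimal parser for a CMX3600-style edit decision list, to show what the
# editorial hand-off carries (event, reel, track, cut type, source and record
# timecodes). The sample EDL text is invented.
import re

SAMPLE_EDL = """\
TITLE: REEL_3_UPDATE
001  SHOT0110 V     C        01:00:00:00 01:00:04:12 00:00:00:00 00:00:04:12
002  SHOT0111 V     C        01:02:10:00 01:02:13:08 00:00:04:12 00:00:07:20
"""

EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+(\S+)\s+(\S+)\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def parse_edl(text):
    events = []
    for line in text.splitlines():
        m = EVENT.match(line.strip())
        if m:
            num, reel, track, cut, src_in, src_out, rec_in, rec_out = m.groups()
            events.append({"event": int(num), "reel": reel, "track": track,
                           "cut": cut, "src": (src_in, src_out), "rec": (rec_in, rec_out)})
    return events

for ev in parse_edl(SAMPLE_EDL):
    print(ev["event"], ev["reel"], "record", ev["rec"][0], "-", ev["rec"][1])
```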
Acrobat Connect, web conferencing software, was used for collaboration during production. It lets an artist view and control someone else’s desktop through an ordinary browser. In one example, one of the Digital Supervisors was called away from a shoot where he was needed to produce the temporary composites that James Cameron used to check takes. The supervisor used Connect to remotely control the computer of one of the interns on set, teaching her how to create the comps.

Non-stop Rendering
When HP first deployed BladeSystem solutions for Weta Digital in 2008, the system had 10,000 cores. It now has 40,000 cores, with 1,000 cores per rack. The team at HP works with Weta’s IT team, headed by Paul Ryan and Matt Provost. Jeff Healey, HP’s Server, Storage & Networking Country Manager, says that HP and Weta share a goal of planning ahead of requirements instead of trying to catch up. They chose the BladeSystem server for speed and low power usage.
Current viewer expectations for VFX mean that every frame a studio like Weta produces is loaded with information and detail. A lot of time is spent planning, choosing products and testing systems. Cost is a major consideration, and the initial price of components has to be weighed against the running cost.
Servers and render farms can run up to 24 hours a day as the pressure to deliver the final film mounts and artists need to see results in as close to real time as possible, and the system can be scaled up or down for different projects. Keeping the artists productive is a key factor. When demand is heavy, the system needs to be resilient, flexible and cost-effective.

Performance and Productivity
Processing performance is always the top priority when choosing a configuration, but optimum performance is not a single factor. Performance has to be balanced against memory demand. Currently, Weta runs dual-socket Intel quad-core 50W processors. An 80W or 120W processor might work faster, but would not cope as efficiently with Weta’s memory demands, and would be much more costly to run. Extra memory quickly becomes expensive – an 8GB system costs more than double the price of a 4GB system, for example.
Input/output capacity for the Weta network is the other major factor. It now stands at 1Gb/s to the BladeSystem servers, but the IT team is trying to gauge when they should increase it to 10Gb/s, and at what point this will increase productivity.
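A back-of-envelope calculation shows what is at stake in that decision. The Python sketch below assumes a 40MB rendered stereo frame and roughly 80 per cent usable link efficiency; both figures are assumptions for illustration, not Weta's numbers.

```python
# Rough comparison of per-frame transfer time over the two link speeds being
# weighed. Frame size and link efficiency are assumptions, not measured values.
def transfer_seconds(frame_mb, link_gbps, efficiency=0.8):
    bits = frame_mb * 8e6                       # megabytes -> bits
    return bits / (link_gbps * 1e9 * efficiency)

frame_mb = 40.0  # assumed size of one rendered frame, in megabytes
for link in (1.0, 10.0):
    t = transfer_seconds(frame_mb, link)
    print(f"{link:4.0f} Gb/s link: {t:.2f} s per frame, "
          f"{3600 / t:,.0f} frames per hour per link")
```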

Stay Cool
After deciding on the configuration, Paul and Matt had to address cooling. Jeff thinks their decision to use water-cooled racks was wise, because they can control cooling precisely enough to avoid slowing processes, without overcooling, which would be expensive. Their job is a balancing act between complexity, rendering power and time, and controlling costs.
In a typical workflow, the artists send frames to the scheduler, which organizes and sends their work to the render wall; the render wall then returns the processed frames to the artists for review. Matt and Paul’s team aim to shorten the time required to complete this cycle. They are also constantly improving the management of processing tasks, automating as much as possible to avoid excessive manual adjustment.
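That cycle - submit, queue, render, return for review - can be modelled with a toy dispatcher. The Python sketch below is purely illustrative of the flow and bears no relation to Weta's actual scheduler.

```python
# Toy model of the render cycle: artists submit frames, a scheduler queues them,
# render 'blades' process them, and results come back for review.
import queue
import threading
import time

jobs = queue.Queue()
results = []

def blade(worker_id):
    while True:
        frame = jobs.get()
        if frame is None:          # sentinel: no more work for this blade
            jobs.task_done()
            return
        time.sleep(0.01)           # stand-in for the actual render
        results.append((frame, f"blade-{worker_id}"))
        jobs.task_done()

# An artist submits a range of frames for a shot.
for frame in range(101, 121):
    jobs.put(frame)

workers = [threading.Thread(target=blade, args=(i,)) for i in range(4)]
for w in workers:
    w.start()
for _ in workers:
    jobs.put(None)
for w in workers:
    w.join()

print(f"{len(results)} frames back for review")
```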

Benchmarks
Weta make a point of keeping up with new processing and rendering developments, and have become influential among developers because they have a good grasp of the demands their artists will face in the future. They are currently testing HP’s Generation 6 blade, the BL2x220 G6. Jeff said, “Weta is passionate about technology. It’s a great experience to work with such a competitive group, who all want to be the best and have the best of everything in their work.”
Working in real time is among their goals, but as the frames grow denser this becomes more difficult. Today’s ‘seamless’ effects also require more processing power. Stereoscopic 3D still tends to put more pressure on the artist than on the render wall, but it does make files larger and frames slower to process. Weta’s character Gollum for ‘Lord of the Rings’ was a benchmark in complexity, but expectations now exceed that level. ‘Avatar’ will become another benchmark.

Video Finishing
Modern VideoFilm in Los Angeles completed the finishing work for 'Avatar'. The entire film and all trailers and promotions passed through Modern's pipeline. The work included conforming, stereo 3D checking, adjustment and quality control. The facility also handled all the stereo 3D English subtitling required for dialogue spoken in the Na'vi language.
Supervising Editor Roger Berger said that their pipeline was running virtually around the clock for the six months before the movie’s release. He has worked on other stereo 3D projects, including James Cameron's 'Ghosts of the Abyss'. He remarked on the high demand for finished movie footage for promotional events over the preceding few months, all before the shots were finalised. Each piece was typically 10 to 20 minutes long and had to be produced while the facility continued to finish the movie itself. They also produced different versions and ran bake-offs between them with James Cameron. One of the Pablos had 12,000 clips online, and the systems had to be able to retrieve clips from anywhere in the movie at any time.

3D Na’vi Subtitling
The movie’s VFX shots, comprising around 75 per cent of the movie, were brought together at Modern from facilities around the world. 'Avatar' also had to be released simultaneously worldwide in both 2D and 3D, and at three different aspect ratios. To cope, the facility set up a post-production pipeline that included three Quantel Pablo Stereo3D systems and DaVinci Resolve 4K. All the shots were renamed under a common naming scheme as soon as they arrived and distributed onto the company's SAN.
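A common renaming pass like that can be sketched in a few lines. The vendor filename patterns and the target convention below are hypothetical - Modern's actual scheme isn't public - but the idea of normalising every delivery before it goes onto the SAN is the same.

```python
# Sketch of renaming incoming vendor deliveries to one common scheme.
# The vendor filename patterns and the target convention are hypothetical.
import re
from pathlib import Path

# e.g. 'WETA_sc042_sh0110_v3_left.exr' or 'ILM-042-0110-L-v003.exr'
PATTERNS = [
    re.compile(r"(?P<vendor>\w+)_sc(?P<scene>\d+)_sh(?P<shot>\d+)_v(?P<ver>\d+)_(?P<eye>left|right)", re.I),
    re.compile(r"(?P<vendor>\w+)-(?P<scene>\d+)-(?P<shot>\d+)-(?P<eye>[LR])-v(?P<ver>\d+)", re.I),
]

def common_name(path: Path):
    """Return the normalised filename, or None to flag the file for manual handling."""
    for pat in PATTERNS:
        m = pat.search(path.stem)
        if m:
            eye = "L" if m["eye"].lower().startswith("l") else "R"
            return (f"AVTR_{int(m['scene']):03d}_{int(m['shot']):04d}"
                    f"_{eye}_v{int(m['ver']):03d}{path.suffix}")
    return None

destination = Path("san/online")
destination.mkdir(parents=True, exist_ok=True)
for f in Path("incoming").glob("*.exr"):
    new = common_name(f)
    if new:
        f.rename(destination / new)
```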
From there they were pulled into the Pablo systems, where they were first checked for stereo and technical quality, then conformed against multiple EDLs, which required speed and accuracy. The pipeline continued with grading and, finally, shots requiring subtitling for the Na'vi language were passed back to the Pablos. When subtitling the stereo 3D versions, the subtitles must sit at the right place in 3D space to avoid conflicts with the 3D content, and the system allows the team to view and reposition them in 3D space in real time.
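The placement rule itself can be stated very simply: find the nearest point of the underlying imagery over the subtitle's duration and keep the text slightly in front of it. The sketch below uses made-up per-frame disparity values and an arbitrary safety margin; on 'Avatar' this adjustment was made interactively on the Pablo, not by script.

```python
# Simplified model of placing a subtitle safely in stereo space: find the
# nearest (most negative) disparity in the underlying shot and put the subtitle
# a little in front of it, so text never appears 'behind' an object it overlaps.
# Disparity values and the margin are illustrative only.
def subtitle_disparity(shot_min_disparities_px, margin_px=4.0):
    """shot_min_disparities_px: per-frame minimum (nearest) disparity, in pixels.
    Negative disparity = in front of the screen plane."""
    nearest = min(shot_min_disparities_px)
    return min(nearest - margin_px, 0.0)   # never push the subtitle behind the screen

# Example: a shot whose nearest content comes 10 px in front of the screen.
per_frame_nearest = [-2.0, -6.5, -10.0, -7.25]
print(f"place subtitle at {subtitle_disparity(per_frame_nearest):.1f} px disparity")
```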

Life After Avatar
David Stripinis said he believes James Cameron has made inroads into the conventional progression from preproduction, previs and on-set shoot to post-production, by bringing many post considerations forward into the stages that precede shooting. He also noted Cameron’s ability to control movie-making - including VFX, lights, look, and camera moves and adjustments - as a director-driven, real-time process from beginning to end, taking greater advantage of tools such as detailed, artistic previs to communicate exactly what he wanted to production teams. Stripinis said that the ability to manipulate assets in real time extends a director’s reach during production.
It’s interesting to speculate whether companies’ software and hardware development plans will be influenced by the artists’ experiences in supporting stereo workflows within their ‘Avatar’ pipeline. Mike Kanfer, for example, said, “‘Avatar’ demonstrates some very clever use of Adobe’s 2D and 2 1/2D applications by the Lightstorm team and other supporting vendors to solve specific 3D stereo issues within production design, previs, motion graphics, editing and compositing.” The expectations of artists and viewers, and the various companies’ ability to respond with R&D programs, will be important factors in the future.

 
Words: Adriene Hurst
Images: Twentieth Century Fox Film Corporation
Copyright 2009 Fox. All rights reserved.