Animal Logic flew into new territory for this project, their first animated feature.
Lensing Director and Previs Artist David Scott talks about capturing the story in stereo 3D, Digital Supervisor
Ben Gunsberger explains the character, light and environmental FX systems his team developed, and
Editor David Burrows describes pulling the film together.


As Lensing Director, David Scott worked with Zack Snyder to determine the cinematic language for ‘Legend of the Guardians: The Owls of Ga'Hoole’, including blocking style and choreography, the cameras and shooting style. He started in about June 2008 when set design was well underway, and storyboarding had just begun. The script had been finalized enough to get started on previs, shot development and working out a 3D prototype of the movie with the editorial team.

Action and Drama
David worked through the previs stage with Head of Story Alexs Stadermann. They didn’t want to rely exclusively on either storyboards or previs but used whichever technique best suited each shot. Storyboards often worked best for drama sequences requiring emotion and expression, while previs suited the action and flight sequences, though this wasn’t a firm rule. “Some overlap occurred. We blocked out a crow chase scene thoroughly with storyboards, for example, but the final dramatic encounter between Soren and Kludd was previs’ed. But it was generally more efficient to choreograph anything to do with flight and plot out big, sweeping wrap-around camera moves in 3D, and to use the emotion of drawn boards for drama.”

As they finalized preproduction and moved forward to animation, his role transitioned to the cinematography side of the project, figuring out how the cameras would relate to the characters and their performances. “Because Zack had a live-action background, he brought that aesthetic to this project. The team tried consciously to preserve that quality and not to force him into an animation workflow and shooting style. Discussions often started by asking how he might shoot a particular scene as live action, and then we worked out a digital equivalent for that.”

Camera Constraints
A CG camera can go almost anywhere and do almost anything, but David’s team put live action constraints on how the camera operates, from giving it a sense of weight and a slight delay as it pushes in, as if it were being pushed on a dolly, to the way an operator might let the camera moves feed off the performances, including slight overshoots. After animation, they took a second camera pass once the performance was in place to build in the small offsets that occur as a camera operator reacts to the actors.

“On wide landscape shots, we referenced footage that Production Designer Simon Whitely had taken in Tasmania, which served as the template for ‘Guardians’. Simon took a RED camera and toured the whole island looking for locations, hanging out the side of a helicopter without a Wescam or any special equipment. His footage has a natural movement and flow as he tried to counterbalance the helicopter moves and adjust the framing. These were the moves we tried to emulate in our CG camera.”

Cockpit Perspective
David feels that virtual flyovers at four to five times the speed of a helicopter are exciting but may interfere with storytelling. To avoid losing the audience, previs helped on aerial scenes such as a battle in which Soren chases Kludd, shot like an old-style documentary or WW2 footage, swapping from a cropped-in ‘cockpit’ perspective to a long-lens shot with shaky camera work that gives the look of a photographer trying to correct his framing. Conversely, a proper BBC documentary approach, shooting birds from a cliff top with a 500mm lens, can distance the audience from the scene because it flattens out the background and suggests the true distance from which it was shot.

This gave rise to the concept of shooting the action as if the owls themselves were carrying the cameras. David explained, “Earlier, Zack had suggested shooting these sequences like regular drama, that is, with a 35mm wide lens, so at first we tried using the same lenses in the air as we used for drama. But when he reviewed the takes, he decided the look was just what the audience would expect and said, ‘Let’s give them something different.’”

Longer Shots
A further test with a 35mm camera, involving a couple of owls, worked better because the wide lens let the audience see the background clearly and kept them close to the birds, with no break in dramatic style. “The change we made was holding the shots slightly longer and using a blocking style going back to ‘Paths of Glory’, essentially a dialogue-driven film but blocked with characters walking into a close-up while the camera keeps moving along with them, one character stepping into the background as another walks into the frame. This was similar to the way owls flew into close-ups, maybe moving back as another bird entered, keeping the shot running. It had the effect of a camera sitting on a bird that flew with the group.

“It didn’t take as much trial and error as you might expect. Zac let the team know fairly precisely what he wanted. Instead of working shot to shot and then telling the story in the editorial suite later, he spent whole afternoons with the previs and layout crew explaining why he likes to block scenes in a certain way, his camera style and philosophy about slow-motion. These sessions established work principles for the whole show and ultimately saved time.”

Real-time DOF
The team started working on the previs in standard Maya but, due to the prevalence of night shots, David wanted to shoot with a shallow depth of field and the aperture wide open. To encourage natural shot development, Autodesk helped them install a real-time depth of field component into Maya early in the process. David reckons they may simply have borrowed the Mudbox component but, in effect, it allowed them to monitor how the depth of field would look as they went. Instead of just setting up planes for far and near distances, they could dial in a DOF setting, quickly get a reasonably accurate look and make pull-focus decisions even in previs, which in turn influenced the whole shooting process. This was the only major departure from the regular Maya camera.
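
What ‘dialling in a DOF setting’ amounts to optically can be illustrated with the standard thin-lens depth of field formulas. The Python sketch below is a generic illustration only, not the component Autodesk supplied; the function name and the 0.03mm circle of confusion are assumptions for the example.

def dof_limits(focal_mm, f_stop, focus_dist_mm, coc_mm=0.03):
    # Hyperfocal distance for this lens and aperture.
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    # Near and far limits of acceptable focus around the chosen focus distance.
    near = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_dist_mm - 2 * focal_mm)
    if focus_dist_mm >= hyperfocal:
        far = float("inf")   # everything out to infinity stays acceptably sharp
    else:
        far = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_dist_mm)
    return near, far

# A 35mm lens wide open at f/2.8, focused on an owl 2m away:
print(dof_limits(35.0, 2.8, 2000.0))   # roughly 1.76m to 2.31m of usable focus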

Throughout previs and layout for animation, they used simple FK, parent-constrained rigs. David feels this prevents over-animating and keeps the focus on the basic choreography. “On the camera side, the Supervising Layout TD Jeff Renton built a few camera rigs allowing us to design shots where a character and camera were parented to the same rig. We also had crane rigs, equivalents to most live action rigs, which helped the artists to think and work in a live action style.

Back to Basics
“Going back to basic camera moves, the pan comes first, then tilt, followed by the roll. In Maya the default rotation orders are quite different, but we deliberately set our rotation orders to match real cameras. As well, building a crane rig in Maya will give a very different look to the shot. Without it, the free camera can just be translated across your scene in a straight line, but if you parent it to a crane rig and offset your pivot point, you can animate the rig to give a beautiful curve to the move, part of the cinematic language.

“A boom shot in a live action movie won’t just be a straight translation from A to B – it will include an arc. Also, the pivot point in Maya sits by default exactly at the nodal point of the film plane, so we offset that to make the camera pivot more realistically.”
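
As a rough illustration of this kind of setup (hypothetical names and values, not Jeff Renton’s actual rigs), a Maya Python sketch that gives a camera a live-action-style rotation order and parents it to a crane group with an offset pivot might look like this:

# Hypothetical Maya Python sketch of the camera and crane setup described above.
from maya import cmds

# A shot camera with 'zxy' rotation order: pan (Y) outermost, tilt (X) next,
# roll (Z) innermost, a common way to make a CG camera behave like a real head.
cam, cam_shape = cmds.camera(name='shotCam')
cmds.xform(cam, rotateOrder='zxy', preserve=True)

# Parent the camera to a crane-style group and push the pivot back behind the
# film plane, so rotating the group swings the camera through an arc instead
# of a straight A-to-B translation.
crane = cmds.group(cam, name='craneArm')
cmds.xform(crane, objectSpace=True, rotatePivot=(0.0, 0.0, -300.0))

# A simple boom move: keying the crane rotation gives the curved path.
cmds.setKeyframe(crane, attribute='rotateY', time=1, value=-20)
cmds.setKeyframe(crane, attribute='rotateY', time=48, value=25)
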
To help shoot the final stages of animation, camera operator Calum McFarlane at AFTRS taught the team about using 35mm film equipment, the weight of the cameras and the feel of real crane rigs. It was useful to understand just how much work a camera operator has to do to control his camera, and that understanding changed several shooting decisions regarding offsets and other moves.

In Parallel
During previs, David’s team collaborated with Simon Whitely, the Art Department and Art Director Grant Freckleton. Because the pipeline at Animal Logic is built to run processes in parallel, set design continued to develop as the shots were designed. “For example, as we enter the Great Tree for the first time, the Art Department did spatial studies with basic structural shapes in place as rough models, just as we started developing shots. We put rough cameras through the model, which went to editorial, and then Simon would make a few adjustments to the set, with the cameras in mind.

“This procedure was typical throughout the production – set design would start at a very early stage with the previs, which went to editorial. As shots were refined and entered production, set design and art direction would continue during final layout, cameras and character choreography. Once animation started, there would be further iterated versions of the set.” Thus, all work was interrelated. Having the DOF so early also contributed to the process. If they knew that certain shots would put the background out of focus, the artists could save time on detail.

Performance Motivated
Animation was another major lensing component for David. Animation Director Eric Leighton would explain his intended moves and actions, which David would plan for in the layout. The cameras were intended to be largely performance motivated. When the animation crew were ready to start a scene, David would be ready to assign cameras, exported from Maya to XSI, so that animation and camera work could develop together, similar to the set design process. The animation blocking stage would get one camera pass, with further passes to follow as the scene built up, feeding off the performance.

After production got underway at the end of 2008, the bulk of the animation was completed within six to seven months. David attributes some of this efficiency to the parallel pipeline. “We could finish a camera move, publish it and the next day see a render of everything associated with it – the latest animation, set design, effects and so on. Lighting could start as soon as animation was underway, without waiting for the final assets, and by the time animation finished they would only have to update the final performances. So, only a couple of days might pass between completion of animation and the final render.”

Camera Export
The final depth of field was applied during compositing, though the design of it and the focus-pulling came directly from the camera operators. When exporting a camera from Maya to Nuke, the compositors could pick up the camera with all keyframe information for the DOF ready to go. There, further small adjustments could be made, but the point is that the design came from the camera operators. David feels this is unusual to find in an animated film.

For battle sequences, key action points were choreographed first for the animators’ first pass. If background characters were needed to fill in, the crowd artists would add a pass. But for the initial previs and choreography only the essential characters were involved. Animation generally comprised a blocking pass, a rough body motion pass and a ‘clean up’ pass for facial animation, and the camera was involved in all three stages.

Recognition
“Probably my favourite sequence was when young Soren is leading the birds through an intense storm to the Tree and loses track of his friend Digger,” David said. “Virtually without dialogue, the scene’s drama comes mainly from the cinematography and the way the set design works with the camera moves. Instead of becoming a chaotic action sequence as Soren’s fear that Digger is lost grows, the sound drops and the shot goes into slow motion. He has a moment of recognition that his friend may actually be dead – though in the end Digger is saved by the Guardians – and from that moment until he lands at the Tree, we were shooting at 48 fps so the action plays back slower than it normally would, adding weight and gravity intended to convey Soren’s thoughts and emotions.”

Slo-Mo Pipeline
“Slow motion was always used with a specific intention, frequently for clarity. In battle sequences, for example, gratuitous action for the sake of excitement was avoided, but when it was called for, Zack often chose to use slow-motion at the most intense moment, slowing right down to let the audience clearly see the emotion of each character and their relative positions. As an alternative to editing action sequences into a series of quick cuts culminating in a close-up or a single dramatic moment, he preferred the slow-motion technique, perhaps because it prevents a break in shooting style.

“What proved more difficult and required some R&D was applying the slo-mo uniformly along the pipeline. In 2D you might render out some extra frames and do a time warp in the composite, but the stereo element also required making sure all details still matched, eye to eye. So we developed a re-time curve, established in layout, that could be shared among the departments to align effects, animation and so on.”
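
A rough sketch of what such a shared re-time curve might involve (hypothetical code, not the studio’s actual tool): a speed ramp per screen frame is accumulated into a monotonic mapping from screen frames to physical time, so every department samples its simulation and animation at the same physical moments in both eyes.

def build_retime_curve(speed_per_frame):
    # speed_per_frame[i] = playback speed at screen frame i
    # (1.0 = real time, 0.25 = quarter-speed slow motion).
    physical = [0.0]
    for speed in speed_per_frame[:-1]:
        physical.append(physical[-1] + speed)   # physical frames advanced per screen frame
    return physical

# Twelve normal-speed frames, a ramp down to quarter speed for the slo-mo beat,
# then a ramp back up to real time.
speeds = [1.0] * 12 + [0.75, 0.5, 0.25] + [0.25] * 24 + [0.5, 0.75] + [1.0] * 12
curve = build_retime_curve(speeds)

# Any department can ask the same question: at screen frame 30, what physical
# time should I sample my simulation at? Both eyes of the stereo pair agree.
print(curve[30])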

Ground-up 3D Stereo
The team wanted to avoid adding 3D at the end of production as an effect, and developed the whole movie with stereo in mind. When layout began, for example, an initial pass would be made to establish correct spatial relationships, and in animation, a corresponding 3D pass would be made.

Shot composition didn’t always work in the same way as in 2D. Letting objects move out into the cinema was an additional consideration. In an over-shoulder shot, when the character being looked at is out of focus, the audience may be more inclined to look at the shoulder, the object nearest to them, which may appear to sit out in front of the screen with them. They aimed to avoid this problem by treating the screen plane itself as the foreground, with the ‘shoulder’ character in the middle ground and the one being looked at in the background.

At first, they tried to avoid out-of-focus objects altogether, but this would have interfered with their filmmaking style and shallow depth of field. “It simply took more attention to detail, ensuring that the frame acted as our stereo window, which everything had to sit behind. Nevertheless, we still wanted to take advantage of the 3D to create an immersive as well as cinematic experience. Rainstorms and flying through fire were moments when we actually let certain FX elements out into the cinema space. In flight sequences too, to enhance the feeling of the camera flying among the birds, characters might fly from the theatre space over into the shot.”

Shooting Style
The live action shooting style was probably the greatest challenge of the project from David’s point of view. “In most animated features, the animation of the camera takes a secondary role to the performances, but Zack’s background centres on the camera. He has worked as the DP himself on his commercial projects. On ‘Guardians’ he was absorbed in the choice of rigs and lenses and wanted the camera to have a major role, not just as an observer but as a participant. He wanted to feel the presence of the camera operator, although not his personality.

“Without a dedicated toolset for this, it was a difficult quality to build into the camera work and we mainly achieved it by hand. In an emotional, dramatic sequence we might use unstable, dynamic moves with more adjustments to add tension and uncertainty. The practical camera training at AFTRS helped us do this, and because several of the operators had also helped during previs, they knew the scenes quite well. It was also a challenge for the animators, accustomed to working with locked off cameras without any unexpected moves, so this interaction between camera operators and animators was another new element.”

Environmental Challenge
Digital Supervisor Ben Gunsberger had worked at Animal Logic earlier in his career on a variety of tasks, then moved to PDI/DreamWorks to work on ‘Shrek’, where he specialized in lighting. He continued in lighting on the ‘Matrix’ sequels after that, and then returned to Animal Logic for ‘Happy Feet’ as Lighting Supervisor.

On ‘Legend of the Guardians’, Ben divided the role of Digital Supervisor with Aidan Sarsfield. Aidan concentrated on the character performance side of the job, while Ben handled lights, backgrounds, rendering and environments. His greatest challenges related to the looks of the characters. “Characters have to come first in any production,” he said. “You can almost get away with the environment not looking perfectly right. While we had plenty of other concerns – from fire to massive environments – we were always focussed on creating engaging characters.

Heavy Feathers
The major problem was the feather system: generating feathers that would look good, move well, not intersect, and still let the artists control and animate them. Work started early on feathers, building on systems they had set up for ‘Happy Feet’. “Because the penguin feathers had been so fine, almost like fur, intersection wasn’t such a problem,” Ben said, “whereas owls’ feathers are larger and more visible. The collisions became a bigger issue.”

To address the problem, they turned to their proprietary toolkit running between Maya and Softimage XSI, called ALF, Animal Logic Fundamentals, which the feather system could plug into. Called Quill, the full system comprised grooming tools in Maya that the surfacing artist used to set up the placement and flow of the feathers, animation tools from XSI that let animators work with and animate the feathers, simulation tools for character effects artists to create wind simulations and, finally, collision effects to handle feather collision either on a single character or character-to-character.

Super Simulation
Refining this system required collaboration between several departments. Given that owls can, for example, rotate their heads 270°, tuck in their wings, and would be seen in super slow-motion and in extreme close-ups at full screen, tight collaboration between modelling, surfacing, rigging, animation, effects and rendering was critical. At times, a slight change in the model made the simulation work better, which meant repeated iteration through the entire pipeline, getting everyone involved with these components together to understand the impact of their work on the simulation.
They continued tweaking until quite late in production, changing poses and individual feather animation, because of new actions and scenarios the characters might have to perform, although the team had tried to make the system robust.

“Textures and surfacing are among Animal Logic's strengths but since 3D amplifies detail, viewers become quite sensitive to texture,” said Ben. “We often reviewed looks in context. Even when building props, we rendered the whole environment together through the shot cameras to check consistency in detail and maintain believability.

Texture Library
Animal Logic’s texture and surfacing toolkit ‘Impasto’ lets an artist layer up different textures. As they started building the film’s elements, they also began building up the texture library. The feather shader, for example, was tackled early on – a complex shader with built in subsurface scattering, translucency, and a specific specular model to produce the correct ‘sheen’, with a variation for wet characters.

Rock textures took a lot of thought as well, because rocky environments occurred in most scenes. “Rock can risk looking dull and lifeless, so we looked for distinctive qualities to make it more interesting. Metals were also important to the story. The Guardians’ gleaming armour represented an essential part of their culture and needed to show strength and a magical quality. It had to reflect and pick up light correctly, and contrast effectively with the Pure Ones’ functional, menacing weaponry.”

Similarly, the team began updating water systems for ‘Guardians’ that had been set up earlier for ‘Happy Feet’. The Art Department painted a variety of foam textures because most of the water shots show it crashing around rocks. “We also shot a lot of reference footage of foamy churned up water to place around the environment. Because the scenes with water were confined to isolated shots, we could spend time getting the precise look we wanted, especially for the storm sequences that used many, many layers for the water surface itself – spray, snow, atmospherics – all of which had to work in 3D as well, and look balanced in the composite.”

Reading the Air
The use of atmospherics was a conscious choice. Even in the ‘Legend of the Guardians’ books, the air is described in detail. Part of the owls’ story is their ability to read the motion and shape of the air, so they wanted the audience to visualize the ‘thickness’ of the air – even beyond rain or snow – through particulates such as dust motes picking up light. “Air is not empty but rich. 3D expanded our scope for achieving this quality. Even the familiar device of god rays took on a new look.”

The effects teams went through a phase early in production, working through the script and deciding on the effects they would need, before they created any previs or had any actual shots. They set up a library of effects and worked out the techniques they would apply to each one. Then when they did have the shots, they would only have to tweak the technique to work in each shot.

Making Eyes
Ben finds that the way eyes are treated in most animated films is art directed and deliberate, like 2D animation in which highlights are drawn in a stylised way. In this film the team looked for a more natural feeling, instead letting the lighting in the scene generate reflections. “You can look for the character that another is talking to, reflected in his eyes. This helps fit the character into the scene and makes the eyes look vivid. Plus, an owl’s eye is so large relative to its head that we had room for lots of detail. We used complex shaders in these eyes for reflection and refraction, and made tests to ensure they looked right from all angles.

“A real owl’s eyes don’t rotate, but are fixed in its skull. Originally we wanted to remain true to nature but found it was hard to identify with characters with immobile eyes. They felt lifeless, so we departed from reality on that point, though they do move less than human eyes.”

Owl Research
Helping with owl research were their ongoing relationships with the Australian Museum and zoos. Production Designer Simon Whitely shot extensive footage at wildlife sanctuaries in the UK of owls flying around, landing, eating or coughing up pellets. Whenever they spotted some distinctive, peculiar behaviour, they tried to find a place in the film for it.

Ben has a strong background in colour and photography, which he finds useful on projects like this because looks are often considered in a photographic language, although the team is not dependent on real camera optics for their images. “We are conscious of focus, composition and exposure. Lighting departments often include people with photographic experience. It’s a useful combination of technical skill and aesthetic sense, like computer graphics itself.”

Hero Crowds
Crowds were handled shot by shot, choosing either a crowd system or individual hand animation depending on circumstance. Again, they augmented a crowd system they had developed for ‘Happy Feet’. One important change was expanding this system to use the full hero character set-ups. This meant the camera could move right up close to anyone in the crowd and still have the full level of surfacing detail. The original system had required special low-res models to be built, but now they could use their full-resolution models and adjust the level of detail according to camera position.

“This was useful because at the time they were setting up the shots, we never knew exactly where the camera was going to be,” Ben said. “Some shots move right in, but the surfacing still holds up. We did have some takes out of previs and layout to give us a sense of where crowds would be, but in the original design phase we might only have had a few lines of script saying a scene needed ‘hundreds of owls’. So we had to balance practicality with resources, and developing a good look against giving freedom in storytelling and capturing great shots.” Previs could be used to help solve problems and show them how they needed to set out the environment to suit a scene.
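
A minimal sketch of the distance-based level-of-detail switching this implies (hypothetical bands and names; the actual crowd system was far richer):

import math

# Hypothetical LOD bands: every crowd agent uses the full hero rig, but feather
# and surfacing detail is dialled down the further the agent sits from camera.
LOD_BANDS = [
    (5.0, "hero"),        # inside 5m: full feather and surfacing detail
    (20.0, "mid"),        # 5-20m: reduced feather counts, simpler shading
    (math.inf, "far"),    # beyond 20m: lightweight representation
]

def pick_lod(camera_pos, agent_pos):
    distance = math.dist(camera_pos, agent_pos)
    for max_dist, level in LOD_BANDS:
        if distance <= max_dist:
            return level

# The camera can push right in on any owl in the crowd; that owl is simply
# promoted to the "hero" band rather than swapping to a special low-res model.
print(pick_lod((0, 0, 0), (2, 1, 3)))   # -> hero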

Night scenes had to be lit with care. As night is the active time of day for owls, night scenes couldn’t look odd or exceptional with a typical blue-toned ‘night time’ look, and while night shots had to look normal, daytime was made to look super bright. Any shots with sunshine used the sun as an almost overpowering light source. Lighting was set up in Maya and rendered in RenderMan. Compositing was done in Nuke.

Render Management
Overall, rendering was a challenge, prompting the teams to constantly optimise as many processes as possible and avoid unnecessary rendering. At the same time, their pipeline was built to accommodate an iterative approach, which meant that many departments – not only lighting and effects – were using the render farm at any time. They worked out a system where the animators could check in their work to process overnight, rendering the full-quality feathers, for example, each night so they could monitor progress.

“All departments did the same. Surfacers could submit renders of their characters, and environment artists could see their work rendered in context. All teams were rendering all the time, so the final lit renders of complete shots only constituted about 50 per cent of the total production’s render time,” said Ben.

Deep Composites
Some shots needed up to 200 passes in extreme cases, but more typically the artist might have to split out each light and all shading components – specular, diffuse, scattering and so on – and all effects elements. They also had a ‘deep’ compositing pipeline. “We could render out each pass as a 3D deep image, instead of just in 2D, and recombine them later in the composite. We didn’t have to render holdouts for all the different elements. We could render, say, characters separately. If there was only to be an animation change, for example, we wouldn’t have to re-render environmental or effects elements. This facility saved on re-rendering and really helped encourage further iteration.”

Holdouts are usually necessary in a conventional compositing workflow. If you needed to put an owl inside fog, you’d render the fog with the owl represented as an empty shape, or holdout, and render the owl separately, then composite the two together once both had been independently refined. A deep composite, on the other hand, lets you render the owl in one pass and the fog in another, and combine them correctly by depth in the composite, with no holdout needed.
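
A toy sketch of the idea (illustrative only, and nothing to do with the studio’s actual deep image format): each render keeps per-pixel samples with depth, and the merge composites them front to back, so the owl and the fog combine correctly without either render knowing about the other.

def deep_merge(*passes):
    # Each pass is a list of (depth, (r, g, b), alpha) samples for one pixel.
    # Sort all samples nearest-first and do an 'over' composite in depth order.
    samples = sorted(s for p in passes for s in p)
    out_r = out_g = out_b = 0.0
    transmittance = 1.0
    for depth, (r, g, b), alpha in samples:
        out_r += transmittance * alpha * r
        out_g += transmittance * alpha * g
        out_b += transmittance * alpha * b
        transmittance *= (1.0 - alpha)
    return out_r, out_g, out_b

owl_pass = [(4.0, (0.8, 0.7, 0.5), 1.0)]                               # solid owl at 4m
fog_pass = [(2.0, (0.6, 0.6, 0.7), 0.2), (6.0, (0.6, 0.6, 0.7), 0.2)]  # fog in front of and behind it
print(deep_merge(owl_pass, fog_pass))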

Flexible Fire
The brief for fire in the film was demanding, calling for qualities described in the original books. It had to look natural, work close-up and in slow-motion, and be art-directable, due to specific shape requirements and the need to interact with characters. Though they considered various third-party software options, none seemed able to perform all of these tasks, so they began developing their own.

“This fire had to be scalable, sometimes covering huge spaces, and work fast enough to play back in a reasonable time. To be both fast and large-scale, it would have to simulate across multiple machines, not a common ability in fire simulation until recently,” Ben said. “We developed a multi-CPU, multi-machine distributed toolkit we could use across everything. We worked on it through to the end of the production, but ended up with a robust system that let us handle shots we hadn’t been sure how to approach before.

“Also, because it was part of our ALF system, we could take fire data out and use it to move particles around, bring character positions into the fire system, and take fire data back out into the feather system. All information could travel back and forth to the various parts of the pipeline. This allowed characters to interact correctly with the fire, and fire with smoke, all of which makes such effects more believable. Sometimes smoke could use the same simulation, such as when fire turns into smoke; at other times smoke was simulated independently of the fire, when the two needed to interact as separate elements.”

Physical Time/Screen Time
When dealing with slow-motion, once the correct effect had been achieved by applying speed ramps to the animation, they checked how it might affect the rest of the shot. Once it had been slowed down, they could still change the animation to accentuate certain details in the slowed sections. It also had a bearing on the other visual effects. “Because we were doing physical simulations with real physics, we needed to have time as a constant, so we needed a way to map real physical time against screen time. Using this approach we could pass things back and forth from physical to screen time. Departments could work in whichever time they needed, but it made communicating about our work a little confusing sometimes.”

Weather elements like fog, rain and snow generally had to be handled as separate elements, determined in the initial look development phase. They would consider, for example, ‘storm’ as a concept and work out exactly what elements they would need to make storms look a certain way. Then, as they worked through the script, they could apply the same recipe to various shots involving a storm, customised for each scenario. The elements themselves had to be developed individually, at times by different types of artists, to get the work done.

Working in Context
The edit was critical to storytelling and continuously changed, of course, so they developed a system at the start allowing anyone at any time to see their work as it would appear in the current edit. As the edit progressed, production published the EDL. “You could load it up and see the scene with your work in its latest version. Animators could see how the motion played from one shot to the other, lighters could see their lights working with animation, or vice versa.

“Edits can put a department’s work out of sync, so this routine was useful for staying on track. Animation and effects are crucial to each other and need to be viewed in context. Conversely, editorial made efforts to gather the latest versions of FX and lighting to see how well they would cut together. A scene planned only on storyboards will look very different with rain, snow and light added to it.”

Living in 3D
Animal Logic received advice from a number of sources on the stereo 3D for ‘Guardians’, especially regarding how it had been achieved at other companies. They chose Tim Baier as stereo supervisor. Tim had spent two years touring the outback of Australia with single and paired cameras, capturing images from extreme close-ups to wide landscape photography from a microlight aircraft, all in stereo. He had some original suggestions on how to approach the 3D aspect of the project. The rule about keeping objects behind the stereo window originated from Tim, for instance, but he also maintained that foreground objects could be allowed out of focus if they were correctly placed in 3D space.

David has heard of some companies assigning one stereo rig to each character and one for the background, all of which would be adjusted later. But as they were only intending to run one final render per ‘eye’, Tim helped them work out ways to simplify the shoot that didn’t require so many extra passes.

A critical technique they used was called ‘cut-cushioning’. David explained, “All shots were rendered with overscan, left and right. After all the editorial decisions had been made and we had the final cut, we could offset each eye in order to determine how far away an object should sit from the screen. These 3D decisions could be left to the absolute last minute. After editorial, lighting and everything else had been finalised, we could still take one last pass and determine where things were relative to that stereo ‘window’ – at, behind or in front of it.

“Consider the importance of effective match cuts in editing 2D action sequences – that is, correctly locating characters moving from one shot to the next, leaving and entering the frame at the same point, for instance. For 3D, you also have to maintain the character’s position in 3D space. To this end we used cut-cushioning to adjust the left and right eye in the composite and determine how deep into the shot the character sat. Then by animating those positions over time at the beginning and end of each shot, we could customise the location, sometimes bringing someone from back to front over only a few frames, to ‘cushion’ the eye and make the viewer more comfortable.”
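
A simplified sketch of what this involves (hypothetical numbers and function names; the real adjustment was made on the overscanned renders in the composite): each eye is translated horizontally, and animating that offset over the first few frames after a cut eases the subject from the previous shot’s apparent depth to the new one.

def cushion_offsets(start_px, end_px, cushion_frames, shot_frames):
    # Per-frame horizontal offset (in pixels) applied symmetrically to the
    # left/right eyes; larger values push the subject further behind the screen.
    offsets = []
    for frame in range(shot_frames):
        if frame < cushion_frames:
            t = frame / float(cushion_frames)                  # 0 -> 1 ramp
            offsets.append(start_px + t * (end_px - start_px))
        else:
            offsets.append(end_px)                             # settled depth
    return offsets

# The previous shot left the character near the screen plane; this shot wants
# him further back. Cushion the change over the first 8 frames of a 48-frame shot.
for frame, px in enumerate(cushion_offsets(2.0, 10.0, 8, 48)[:10]):
    print(frame, round(px, 2))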

Cutting A Legend
After Editor David Burrows had worked at Animal Logic as a visual effects editor on ‘Happy Feet’, putting together layers and elements for shots, he stayed on as the company prepared for ‘Guardians’, including the original test that led to the film’s approval for production in December 2007.

In the early stages of production, David helped the effects and animation artists monitor how their work was cutting together. But for an animated feature, in fact, the initial scene cutting comes before this stage and is done on the voice recordings. The voices were edited together into scenes based on timing and performance, resulting in a ‘radio play’. Next, the production’s large team of storyboard artists blocked out the script into storyboards, which the editorial team cut to fit their radio play, producing an animatic to look at the way the voice/action combination was working out for timing, pace, humour and so forth. “This is a very creative period, a chance to try out any ideas without having to commit to anything because it’s still very early in the production.”

Commitment
When the animatic is running well, however, they start committing scenes to 3D animation in the layout process, where the cameras take over the storytelling. David doesn’t want to make the process seem like an assembly line – it actually works more as a loop. “We might go back to the ‘board stage, recut a scene and bring it back to layout to make a new shot. We continuously refined scenes but did so less as we got toward the end, when re-doing work becomes more labour- and time-intensive. We edited with Final Cut Studio, using FCP for the offline with quite high-resolution ProRes 720p HD images, which were eventually conformed at the higher resolution for output.”

While all teams were working with 3D stereo in mind, editorial was mainly editing in 2D. “Editing in 3D is quite difficult. Ideally we edited in 2D, although if we wanted to check, say, a fast-paced battle scene, it was easy to conform it in 3D, put on some 3D glasses and assess it in our theatre here. We cut in 2D and reviewed in 3D but in the end, I don’t think the stereo factor really changed our decisions about the work.”

Comfortably Immersive
A lot of effort went into making this both an immersive and a comfortable 3D experience. “The cut-cushioning technique [see also David Scott’s description in this article] was important as we approached an edit. Each eye, or stereo window, animates closer or further apart depending on what we’re focussing on and the desired effect. It visually softens the ‘blow’ from cut to cut. Before this process was adopted, coming from a wide shot to a close-up could be quite jarring on the viewer’s eyes. I feel Animal Logic has made some progress in 3D shooting in this regard, making the effect a part of the story as well as comfortable.”

Some fighting scenes were blocked out live on a sound stage with Director Zack Snyder’s stunt team wearing cardboard owl wings. Their performance was videoed and used quite faithfully, via the previs department, to choreograph the fighting and choose camera angles. “The scene near the end of the movie, when Metal Beak is about to kill Ezylryb and Soren flies in, was one such scene that worked out very well. We had Zack’s video running in one corner of the screen and the shot in the other and could see how closely the two matched,” said David.

Straight to the Heart
As on any film, David was trying to cut to the emotional heart of the scenes. Previs helped them do this on the more FX-heavy sequences, for example when the birds fly through the storm, a sequence involving dramatic camera moves, rain effects and lighting. Very little had to change from the previs, even once the many layers – light, rain, feathers – were rendered in.

David’s favourite scene occurs when Grimble is teaching Soren and Gylfie to fly. Nyra comes in and the two must escape, but Soren hesitates, wanting to persuade his brother to come with them. A fight ensues and Grimble is killed, giving Soren the mission to go to the Great Tree. Finally they escape and suddenly realize that they really can fly for the first time. “Beyond the excitement is the emotional pull between the brothers, Nyra’s treachery, the narrow escape and the mastery of flight. I had spotted its potential as a really big moment in the film when I read the book.”

Throughout production the editorial team was under pressure to deliver a product, and technical skill was crucial at every stage. But equally important, David reflected, was having a passion for the craft, the story and seeing the whole project brought together with input from all departments. Editorial was a hub for the production. They cut together the original scene for the teams to bid on, to figure out how many shots it would need and how much it would cost. As the completed shots came back from the departments, editorial was again responsible for conforming the final movie and delivering it.

Words: Adriene Hurst. Images: Courtesy of Warner Bros. Pictures.
Featured in Digital Media World.