Cinesite’s VFX Supervisor Simon Stanley-Clamp travelled down to Tenerife for four weeks of on-site supervision for the facility’s main sequence, the battle between Perseus’ band of soldiers and the Scorpiochs, just before the shoot there started in May 2009. They had received the award, estimated at about 90 shots, in December 2008 and started work on the project in January. Warner Bros had said it would be a relatively short assignment with a tight, short shooting schedule. But it stretched to a 26-week post schedule, and they worked on 176 shots in the final cut.
The shoot at Tenerife lasted three weeks. “I went out a week earlier with the 2D texture artist, a photographer and tracking lead Matt D’Angibau while the set was still under construction,” said Simon. “It took about a month to build, set in a national park at the foot of a volcano about 2000 feet up. In a big open area, the designers had constructed three sets comprising Greek ruins. We took a full set survey, detailed texture shoot and measurements. A company was also contracted to do LIDAR scans just prior to the shoot.
“We took hourly HDRI photography for lighting at the three locations, knowing that the crew would often be shooting in bright sunlight. We figured if we had light for every hour of the day, we should be covered.” Once the shoot started, Simon also shot specific HDRI of each scene as reference. “Things can change on the day – an extra reflector, diffuser or even an extra light can appear on set.” Throughout the shoot, Cinesite’s team continued to record an immense amount of data, some of which they shared with specialist matchmovers in VFX Supervisor Nick Davis’ team.
“Nick was keen on previs and hired a specialist group from Nvizage for three major sequences, the Kraken, the Medusa and the Scorpiochs battle. In the UK, some colleagues and I had gone to Longcross Studios where part of the shoot was underway, and took over the initial previs to finesse it and work in some of our own moves. Very early on, we made a point of having the crew use our Scorpioch model in the previs, to make sure it could move properly in the shots, looked as it should and was rigged the same way.”
It had happened to him before, when the previs was inaccurate, to get the first edit back and find the shots didn’t match it. By giving the previs team the correct model and rig, they were able to avoid this happening. “Obviously, we could have conformed, broken our rig and made the Scorpiochs do almost anything. But we were confident in the look and behaviour of our design and wanted it to be used. They should be menacing, powerful beasts that thunder around, not flighty or floppy. We’d built constraints into the rig for these characteristics.”
Simon reflects that the Scorpiochs were under less pressure than the other major creatures, the Medusa or Kraken, to pay homage to Ray Harryhausen’s 1981 designs. “The earlier Scorpiochs had a smaller role, and were basically photo-real scorpions, smaller in size with none of the armour plating these later ones have. A reference we did make was the slightly staccato animation style, verging on a stop motion look in some shots.”
Simon’s team studied live scorpions at a scorpion farm where they shot reference video and close-up stills and based their models on these. They used the video to check their animation for walk cycles, agility, speed and to observe different species’ behaviour.
“Using Mudbox, the modellers developed about 11 freehand designs as model options. In the sequence there are five different sizes. They started with the basic middle-sized one and the others are extrapolations of this one, with variations like claw sizes, where the eyes were, how muscular they were and different leg designs. We avoided letting them look like lobsters,” said Simon.
Once the creatures existed as CG models, Simon composited them in Photoshop into a series of boards with close-ups of the tail, underbelly and other components, and presented them to Director Louis Leterrier and Nick Davis, who chose legs, eyes and claws and composed their own creatures. “We went back to Mudbox and re-worked the models accordingly. Once they had them closer to what they liked, we went into turntables. The signed-off versions were exported to Maya for a full build of the creatures with textures supplied, and the rigging that had worked best was finessed.
“Due to the speed of production they had been using proxy Scorpiochs for animation tests, dummy runs and walk cycles to show Louis and ask his opinion about moves in shots,” Simon said. “Meanwhile, we kept working on the design, modelling and texturing. Texturing is the last attribute to complete, and was under development throughout the shoot. We didn’t see them fully rendered until a couple of months after the shoot.”
Textures and colours became the chief means of differentiating the Scorpiochs, prompted by a change in the edit. The whole battle sequence was originally written in the script as three separate fights, and so at first was conceived as three discrete edits - Draco fights a middle sized Scorpioch, two brothers take on a smaller one, and Perseus tackles another. But when the editors had pulled down all the shots and created the first cut as in the previs, it simply ran too long.
At that stage, they decided to mix the three fights into each other, intercutting the edits. “You can see the overlapping fights – first a view of the brothers, then Draco, then Perseus – back and forth. But it was hard for the viewer to keep track of, so we decided colour-coding would help.
“The smaller one was made darker, shiny, more oily and leathery. Perseus fights a brown Scorpioch and Draco’s is quite orange. Shown next to each other, the differences are obvious but the colouring may not come across so well in the film. We experimented with more extreme colours such as a white one, to see if that helped but it drew too much attention and didn’t fit as well into the sequence or story.”
The colours were a late innovation but meanwhile dozens of new textures were invented and collected for the Scorpiochs. Simon was looking for a rhino or elephant skin quality, avoiding the look of a mollusc shell. Other kinds of reference came from insects, and the observation that yellow and black were common among dangerous animals like spiders and snakes.
Weight and Speed
Animating the Scorpiochs required scaling them to an ideal size and balancing their weight and speed to look believable. They ran extensive animation tests until they had a feel for the creature in motion, and referred to descriptions in the script of the way it should move. Having just completed animation work on ‘The Hulk’, they had experience handling powerful, athletic beasts. They envisioned the Scorpioch bumping into other objects, a driving force relentlessly moving forward, even slightly stupid, constantly trying to break through without stopping to think and look around.
“To get the weighting right,” Simon said, “it’s important for the animator to be aware that he’s always working within a plate. Whatever or whoever else is in the plate is your clue to how your character should act or look, play off the other elements as your reason to duck, raise a claw. We had blocked out every single shot in advance and Louis directed them for speed, composition and drama.
“At that stage, it was like any animated feature – we do rough versions, look at them in dailies, cut them back in, see how it works with the shot before and after. We tried extremes of fast and slow, not looking at them in isolation but cutting them back into the current sequence and developing a scene. The full animation cycle of dailies, reviews, signing off, moving into lighting and more finessed versions all helped us understand how well it would integrate into the shots.”
The environment in Tenerife where the Scorpiochs were fighting was extremely dusty, and required a sophisticated particle simulation system. This took Simon’s team about six months to develop after the initial brief. “I’d been on a research trip with Nick and people from the other vendors. We’d seen some stills and were told that the shoot would be in a volcanic location with hard, crusty earth rather like a dried lake, and that stomping into it would send up large particles and big lumps of earth.” But on arrival they found small loose stones the size of marbles, covered with gritty sand and fine powdery dust that flew everywhere – in short, completely different.
“We had always been planning some kind of particle system, but for larger pieces and boulders. Each step on set produced clouds of dust, demanding a rethink. It took a while to refine, aiming to match the first plates we got after the shoot, again by lots of trial and error. By the end, we’d written a new fluid dynamics simulation pipeline based on Houdini and rendered through Mantra.”
Getting the lighting to work – the interaction and shadow play – was the trickiest part. The animation drives the dust, but the Scorpioch then needed to cast a shadow back onto the dust because it was so thick, and the dust in turn casts a shadow on the ground. The animation involved several interacting computations and caches, cycling round before an element could be produced to composite for each footfall. “We generated three passes of interaction – gravel, sand and dust – and often put in ground planes by going back to the original LIDAR scans, to make sure each step fell to the correct depth.”
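The per-footfall pass structure Simon describes could be sketched roughly as follows. The function names, grid format and values here are invented for illustration, not Cinesite’s actual Houdini pipeline code; the point is only how each animated footfall spawns one element per interaction pass, snapped to the scanned terrain.

```python
# Illustrative sketch of per-footfall interaction passes.
# All names and values are assumptions, not production code.

PASSES = ["gravel", "sand", "dust"]  # coarse to fine, rendered separately

def ground_height(lidar_grid, x, z, cell=0.5):
    """Look up terrain height from a LIDAR-derived height field."""
    i, j = int(x // cell), int(z // cell)
    return lidar_grid.get((i, j), 0.0)

def footfall_passes(footfalls, lidar_grid):
    """For each animated footfall, emit one element per pass, snapped
    to the scanned ground so every step lands at the correct depth."""
    elements = []
    for frame, (x, y, z) in footfalls:
        ground_y = ground_height(lidar_grid, x, z)
        for p in PASSES:
            elements.append({"frame": frame, "pass": p,
                             "pos": (x, ground_y, z)})
    return elements

lidar = {(0, 0): 1.2, (2, 4): 0.8}   # toy stand-in for a LIDAR scan
falls = [(101, (0.3, 5.0, 0.2)), (115, (1.2, 5.0, 2.1))]
els = footfall_passes(falls, lidar)
```

Each footfall yields three composite-ready elements, and the foot’s vertical position comes from the scan rather than the animation, matching the “correct depth” requirement.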
The city of Argos was based on concept artwork from the production designers, loosely based on old Greek cities. A team at MPC also went out and shot more reference in Mediterranean locations, in particular Malta and Matera – both wide shots, to see how the cities were laid out and capture the general look, and detail shots for textures and modelling buildings. Many of the reference cities were near water, to capture the look of the harbour location in the script.
From these references CG Supervisor Patric Roos and the CG team presented new artwork incorporating location-based shots captured by the second unit crew in Tenerife, featuring dramatic cliffs and an old volcanic rock coastline. The location of the city was based on a real place, from where the production designers also supplied them with reference material. They chose what was most useful and retrofitted the material with the live action plates. “The city has a very specific role within the story as the place, set in the cliffs, where the Kraken can enter and prevent people from escaping,” Patric said.
Locating the Camera
This was their start point for creating a CG world into which they could fit the live action set, shot in the UK. They had to be able to go from a fully CG shot to different shots in the live action ‘city’ location, one after another in sequence. So that these sequences made sense, whenever the camera landed in a street, they had to know what they would see when they looked up – cliffs, for example, or a particular part of the city.
“We had some flexibility in being able to move things around to get the layout and composition of images right, but we tried to locate the action accurately as far as possible, and did a lot of previs. Fitting the live action into the CG was a major task for us,” said Patric. His team also aimed to determine in advance which assets they needed to build at high and low resolutions.
Patric noted, “The CG and live action camera moves had to work together. Some shots were all CG, some had the live action in the foreground, some in the background, populated with CG characters. Because we could do whatever we wanted with the camera, it wasn’t clear at first which assets to put the most effort into, so we built most of them to a high standard to cover ourselves. Only occasionally did we have to go back to increase resolution.”
Fortunately, they had a good working relationship with Louis Leterrier and Nick Davis, who helped with previs to show everyone what to expect in most scenes, how to narrow down the effort and focus on what was key for the story. The city in particular had to be manipulated to tell their story, so understanding how the director saw it was critical.
“Argos had to look as though the Kraken could really destroy it, for example,” said Patric. “It had to accommodate Perseus’ chase sequence with the Harpies. Clues like these, inherent in the story, helped us plan the CG more efficiently.
“Our 3D pipeline is Maya-based. Many of our proprietary tools sit inside Maya as plug-ins, while others are external, usually the ones for simulations. All rendering is done with Renderman, matchmoving with Boujou and 3D Equalizer. About 25 per cent of compositing was done in Nuke, and the rest with Shake, although this balance is changing toward Nuke.”
Specific to cityscapes, MPC has a layout department and layout tool. The workflow starts with previs and plates, while the layout department manages the assets, keeping them up to date regarding shot composition and the cameras to be used. For cities like Argos, this department is responsible for laying out the buildings into the streets. The layout tool was developed specifically for this task on ‘Prince of Persia’, which has a very large city.
The difference for ‘Titans’ was the steep terrain, which meant customising this tool and laying out parts of Argos by hand. But one tip they took from ‘Prince of Persia’ was dividing the town into different ‘postcodes’ with specific layout characteristics – high-class areas, a palace area, market, slums. Similar areas could be populated by similar rules.
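The ‘postcode’ idea – similar areas populated by similar rules – could be sketched as follows. The district names, building types and density values are invented for illustration; MPC’s actual layout tool is proprietary and not described in that detail here.

```python
# Sketch of postcode-driven city layout: each district carries its own
# rules, and similar areas are populated by similar rules.
# All rule values below are invented for illustration.
import random

POSTCODES = {
    "palace": {"types": ["temple", "colonnade"], "density": 0.2},
    "market": {"types": ["stall", "warehouse"],  "density": 0.8},
    "slums":  {"types": ["hut", "shack"],        "density": 1.0},
}

def populate(postcode, plots, rng):
    """Fill a district's plots according to its postcode rules."""
    rules = POSTCODES[postcode]
    placed = []
    for plot in plots:
        if rng.random() < rules["density"]:
            placed.append((plot, rng.choice(rules["types"])))
    return placed

rng = random.Random(7)  # seeded, so the layout is repeatable
buildings = populate("slums", [(x, 0) for x in range(10)], rng)
```

Districts that share a postcode share a rule set, so a second slum quarter elsewhere in the city can be filled with one call rather than laid out by hand – exactly the economy the steep Argos terrain then forced them to break in places.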
“But tweaking the elevation to give the terrain the necessary dramatic appearance, for example, changed the ground plane, sometimes creating ‘empty’ areas. Layout on steep terrain is tricky. This department’s work is really the backbone of CG shot production, ensuring each shot has all assets, updated and in place,” Patric said.
The people visible in the cityscapes were mostly CG, generated using ALICE, one of MPC’s tools originally written for ‘Troy’ in 2004, and based on motion capture shoots done early on in production. They knew in advance what types of action they would need to insert – such as people running up and down steps and, of course, panicked crowds for the final sequence.
“The insertions, although used very small, really add to the effectiveness of the city, vary the colour palette and break up the profile, composed of geometric flat-roofed Greek buildings. The same principle applies to adding vegetation elements. The layout department probably added around 50,000 trees in the end, placed by hand between buildings and along streets. They added bits of smoke and other details to make audiences believe the city is living. But it is done with a process, not randomly. Backgrounds can be enhanced more broadly with matte paintings but foregrounds need attention to detail to express size and scale.”
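“A process, not randomly” usually means seeded, deterministic scattering, so a detail pass is repeatable from shot to shot. A minimal sketch of that idea, with invented spacing and jitter values (not MPC’s actual tool):

```python
# Sketch of deterministic detail scattering: a seeded generator gives
# natural-looking jitter, but the same seed always yields the same
# placements. Spacing/jitter values are invented for illustration.
import random

def scatter_trees(street_segments, spacing=4.0, jitter=1.0, seed=42):
    """Place a tree every `spacing` metres along each street, with a
    small seeded jitter so rows don't look machine-made."""
    rng = random.Random(seed)
    trees = []
    for (x0, x1), z in street_segments:
        x = x0
        while x < x1:
            trees.append((x + rng.uniform(-jitter, jitter), z))
            x += spacing
    return trees

streets = [((0.0, 20.0), 0.0), ((0.0, 12.0), 8.0)]
first = scatter_trees(streets)
second = scatter_trees(streets)  # identical: same seed, same layout
```

Because the seed fixes the jitter, re-running the scatter for a new shot reproduces the same trees, which matters when the city appears across dozens of angles.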
Patric has worked on cityscapes before, but none that had to function under so many conditions in a film: it had to fit with a live action set design, the camera had to pull in very close to it as well as render the whole view very wide, and zoom from wide to close in one shot.
“Normally, you can get away with doing a lot more with projection work,” Patric explained. “But we had to think of this project as if we were really building a city, and prepare in advance for most camera moves. The Kraken sequence was an unknown for some time. We knew where the cameras would be, but not necessarily how close. So we simply built the whole of it in 3D, even for projections, because we would need the geometry anyway.
“From their library of building types, the layout department laid out representations of where certain buildings should be to use in previs. As they went into rendering shots and look development, they swapped out the placeholder buildings for high-res versions. As a result, previs happened parallel to layout, parallel to asset building – all happening simultaneously. It was a testament to our pipeline that this regime could work.”
MPC’s most challenging creature character in ‘Clash of the Titans’ was the Kraken. Initial designs and a rough 3D model came from Aaron Sims. “We figured out the scale and made enhancements, like length of arms and fingers, allowing it to move and function properly with the plates,” said Patric. “Most of the head changed as well through the modelling phase, based on Louis’ and Nick’s direction. Remodelling was a long process, owing to its scale, but also fun and interesting. MPC also did additional concept work on the colour schemes in-house, using photography to be able to source similar reference for the texture artists.
“We did motion tests to make sure he could perform as a character. There's always a very collaborative effort between animation and rigging in our internal pre-production phase.” New developments were made on the rig, primarily for the tentacles, including a new solver and tools allowing secondary jiggle and the ability to stick the tentacle on a surface and create drag on the skin.
“The CG team’s work centred on building a character that would hold up to potentially any kind of shot, close up or from far away,” Patric said. ‘Close up’ at this scale means literally seeing the whole face or whole hand covering the screen. The challenge was achieving believable detail at that scale and applying it consistently across a vast character. “We came up with a dynamic skinning solution that used a low-resolution rig to solve most of the muscle work and skin movement, but at render time tied it to a much higher resolution version of the model, which let us put the camera almost anywhere. We could go in and sculpt out more detail, and also got around problems like not having enough edge break-up.
“We also added a lot of photo-based procedural textures driven by mattes and a layered shader allowing us to add many levels of detail in the textures, so as you move closer it reveals finer detail. We were confident that it would hold up for any type of shot.
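The distance-dependent layering could be sketched as a simple blend schedule: detail layers fade in as the camera approaches, so distant shots see only the base texture while extreme close-ups get every level. The layer names and distance thresholds below are illustrative assumptions, not MPC’s actual shader setup.

```python
# Sketch of camera-distance-driven texture layering.
# Layer distances are invented for illustration.

LAYERS = [  # (name, fully visible closer than, faded out beyond)
    ("base",   float("inf"), float("inf")),
    ("medium", 50.0, 200.0),
    ("fine",   10.0, 50.0),
]

def layer_weights(cam_dist):
    """Per-layer blend weight in [0, 1] for a given camera distance."""
    weights = {}
    for name, near, far in LAYERS:
        if cam_dist <= near:
            weights[name] = 1.0
        elif cam_dist >= far:
            weights[name] = 0.0
        else:
            weights[name] = (far - cam_dist) / (far - near)
    return weights

w_far = layer_weights(300.0)   # wide shot: base layer only
w_close = layer_weights(5.0)   # close-up: every layer at full strength
```

The base layer never drops out, so any shot falls back to a complete texture; the finer layers only cost render time when the camera is near enough to see them.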
“When the Kraken turns to stone, the shattering simulation was done in PAPI, another in-house tool. We actually cut the big pieces by hand because the effect was tightly choreographed and they needed to be very accurate. In effect, we made a separate build of the Kraken, adding smaller debris and PAPI sims to enhance the look of the bigger ones, and finished it off with particle and dust simulations.”
Over the Ocean
The CG ocean in the Kraken sequence was MPC’s work as well, using Flowline. “We first create the ocean surface and from that surface, use different attributes like height, speed or turbulence to emit particle or volume effects, generating foam, spray and bubbles. These elements then emit another set of elements, such as mist and volumetric spray. Once you have all of these elements, you can break them up into manageable chunks depending on the shot and how much data you can throw at Renderman at once,” Patric explained. “Big shots easily had 20 or more elements rendered and hundreds of millions of particles mixed with volumes and sprites. With reflection passes of characters and the city and effects in the water, it all adds up pretty quickly.
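The emission cascade Patric describes – surface attributes spawning foam and spray, which in turn spawn mist – could be sketched in miniature as below. The attribute names and thresholds are invented for illustration, not Flowline’s actual interface.

```python
# Sketch of the two-stage emission cascade: surface attributes decide
# where foam/spray is emitted, and those elements then emit mist.
# Thresholds are illustrative assumptions.

def emit_foam(surface_points, speed_threshold=2.0):
    """First stage: fast, turbulent surface points spawn foam/spray."""
    return [p for p in surface_points if p["speed"] > speed_threshold]

def emit_mist(foam_points, height_threshold=1.0):
    """Second stage: foam thrown high above the surface spawns mist."""
    return [p for p in foam_points if p["height"] > height_threshold]

surface = [
    {"height": 0.2, "speed": 0.5},   # calm water: nothing emitted
    {"height": 1.5, "speed": 3.0},   # breaking crest: foam and mist
    {"height": 0.8, "speed": 2.5},   # churned water: foam only
]
foam = emit_foam(surface)
mist = emit_mist(foam)
```

Chaining the stages like this is what makes the element count explode on big shots: every layer of secondary emission multiplies the data fed to the renderer.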
“We wrote our own tools to bring Flowline data into Renderman, either as meshes or implicit surfaces. For Clash, the scale of the water was quite large so we relied solely on the meshing technique, focussing on the mesh tessellation to manage memory better at render time. The volume rendering is quite new in Renderman so we tried different cache formats. Again, it’s pretty memory intensive so we used it where we could. Close to camera we used sprites and volumes combined with live action elements, blended in compositing.
“Flowline is very good at creating water interactions. It simulates fluids in a 3D voxel grid in a container, and CG characters and props are added as colliding objects. We used a lower res version of our Kraken moving around and a version of the city specifically made to work efficiently for these solves.”
They had to find ways to increase the voxel grid density to allow enough detail to sell the scale where interaction occurred. They devised a hybrid method producing a generic ocean surface that let them add hydro fluids, or 3D voxel sims, locally and then blend them back with the ocean surface around the edges. The containers were smaller, and helped add complexity to the emitted elements from those surfaces.
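The edge-blending of a local voxel sim back into the generic ocean surface could be sketched as a falloff weight: full simulation weight deep inside the container, easing to zero at its boundary. The falloff width is an invented parameter, not taken from MPC’s setup.

```python
# Sketch of blending a local hi-res sim into a generic ocean surface:
# the sim fully overrides the ocean inside its container and fades out
# at the edges so the two join seamlessly. Falloff width is invented.

def blend_weight(x, container_min, container_max, falloff=1.0):
    """1.0 deep inside the container, easing to 0.0 at its edges."""
    if x <= container_min or x >= container_max:
        return 0.0
    edge_dist = min(x - container_min, container_max - x)
    return min(1.0, edge_dist / falloff)

def blended_height(x, ocean_h, sim_h, cmin, cmax):
    """Surface height: a weighted mix of ocean and local simulation."""
    w = blend_weight(x, cmin, cmax)
    return (1.0 - w) * ocean_h + w * sim_h

h_edge = blended_height(0.5, 1.0, 3.0, 0.0, 10.0)  # near edge: mostly ocean
h_core = blended_height(5.0, 1.0, 3.0, 0.0, 10.0)  # deep inside: all sim
```

Keeping the containers small and local is what lets the voxel density go high enough to sell the interaction detail without simulating the whole ocean at that resolution.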
“We kept the lighting approach consistent, and made sure the water shader was compatible with the rest of the shader library we use. We used image-based lighting, or IBL, for the water reflections, and for the scattering component we used physically correct, measured scattering data to produce the right colours, depth visibility and scale.”
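The role of measured data in getting colour and depth visibility right can be illustrated with simple Beer-Lambert attenuation: per-channel absorption coefficients make red light die off far faster than blue, which is why deep water reads blue-green. The coefficients below are illustrative, not MPC’s measured values.

```python
# Sketch of depth-dependent water colour via Beer-Lambert attenuation.
# Per-channel absorption coefficients (per metre) are illustrative.
import math

ABSORB = (0.45, 0.07, 0.04)  # (R, G, B): red is absorbed first

def transmitted(colour, depth_m):
    """Attenuate light travelling depth_m metres through water."""
    return tuple(c * math.exp(-a * depth_m)
                 for c, a in zip(colour, ABSORB))

white = (1.0, 1.0, 1.0)
shallow = transmitted(white, 1.0)    # still nearly white
deep = transmitted(white, 20.0)      # red gone, blue-green survives
```

With physically measured coefficients instead of eyeballed colours, the same shader gives correct depth visibility at any scale, which matters when a shot cuts from harbour-level water to the full Kraken.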
Pegasus took two forms. For most close-ups when he was on the ground, he was a live action/CG composite, while in flight, he was a fully CG creature. The combined Pegasus was based on shots of a black live action horse to which the CG team had to add wings. They decided that the best way to deal with a black horse with feathers was to extend MPC’s proprietary fur software, Furtility.
“The black colour of the horse added a special problem,” Patric said. “Because Furtility’s shader tries to be physically accurate and the parameters are quite sensitive, the horse’s shape would be defined by the specular highlights on the hairs. Normally feathers could be made on cards and modelled. But in this case, we had very short black hairs, with a specific look, adjoining the feathers.”
Focus on Highlights
Applying reverse engineering, they knew how the hairs would behave from the reference footage of the horse and used this as the start point for their extension, allowing them to instance CG feathers made with the same geometric primitive as the short hair, and to use the same shader. So, when the hair and feathers were next to each other, they behaved naturally and when you saw the highlight moving across the horse, it didn’t change from one look to another.
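The key trick – building the feathers from the same primitive and shader as the short body hairs so the specular highlight reads consistently – could be sketched as below. All names here are illustrative, not Furtility’s actual API.

```python
# Sketch of shared-primitive grooming: hairs and feather barbs are the
# same curve primitive carrying the same shader, so highlights match.
# Function names and parameters are illustrative assumptions.

def make_curve(root, length, shader="black_fur_spec"):
    """One hair/barb curve primitive, tagged with a shared shader."""
    return {"root": root, "length": length, "shader": shader}

def groom_body(roots, hair_len=0.01):
    """Short body hairs, one curve per follicle root."""
    return [make_curve(r, hair_len) for r in roots]

def instance_feather(root, barbs=20, barb_len=0.05):
    """A feather built from many barb curves of the same primitive type."""
    return [make_curve((root[0], root[1] + i * 0.002), barb_len)
            for i in range(barbs)]

body = groom_body([(0.0, 0.0), (0.1, 0.0)])
wing = instance_feather((1.0, 0.0))
# every curve, hair or barb, carries the same shader id
shaders = {c["shader"] for c in body + wing}
```

Because there is only one shader in play, a highlight sweeping from the horse’s flank onto the wing never switches look mid-move – the effect Patric describes next.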
“The shader can be difficult to control,” said Patric. “But in this case we needed to let the feathers match the body. I’ve used feathers on cards before, shading and texturing them, but because the feathers and fur were black, dealing with the highlights became the focus of the task. The result shows that we can put a CG creature with a groom next to a live action character, and the two can be blended.”
The wings needed to be very big on screen, and some shots were at very close range. Among the most challenging was a close up of Perseus on Pegasus’ back in the final sequence. Actor Sam Worthington was shot against green screen on a ride rig, replaced later with a full CG horse.
Again thinking backwards, they revisited what they had done for their most demanding shot in the film, one with the Kraken in it. “On the very close up shots of Pegasus we had tried to displace the model but had trouble capturing the groom of the hair onto the surface, because of the hair’s base mesh and the need to add displacement through the grooming tools.
“I tried the skinning solution instead, which could add a high level of detail at render time and carry the groom as well. We could increase the detail of the mesh – revealing the veins under the skin, for example, by refining the highlights and specular of the fur, all of which improved the close up shots. The same applied to the feathers.
“From further away both appeared as simplified surfaces, but as the camera pushed in, the detail was all there. We could take it to as high a resolution as we needed and drop back to low-res easily, all handled in the pipeline. At render time we can dial the resolutions up or down pretty easily, and make approximations between fast and slow, close or far away moving shots.”
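The dynamic-skinning idea used for both the Kraken and Pegasus – a low-res cage deformed by the rig, with a higher-resolution mesh tied back to it at render time – could be sketched with a simple nearest-point binding. This binding scheme is an illustrative stand-in for MPC’s proprietary solution.

```python
# Sketch of render-time dynamic skinning: bind high-res points to a
# low-res cage at rest, then rebuild them on the deformed cage, so rig
# resolution and render resolution are decoupled. Illustrative only.

def bind(hi_points, lo_points):
    """Bind each high-res point to its nearest low-res cage point,
    storing the offset so deformation can be transferred later."""
    bindings = []
    for p in hi_points:
        j = min(range(len(lo_points)),
                key=lambda k: sum((a - b) ** 2
                                  for a, b in zip(p, lo_points[k])))
        offset = tuple(a - b for a, b in zip(p, lo_points[j]))
        bindings.append((j, offset))
    return bindings

def deform(bindings, deformed_lo):
    """Rebuild the high-res mesh on the deformed low-res cage."""
    return [tuple(c + o for c, o in zip(deformed_lo[j], off))
            for j, off in bindings]

lo_rest = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]     # tiny two-point cage
hi_rest = [(0.1, 0.2, 0.0), (0.9, 0.2, 0.0)]     # detail points
b = bind(hi_rest, lo_rest)
moved = deform(b, [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)])  # cage raised 1 unit
```

Because the binding is computed once and the high-res mesh can be swapped for an even denser one at render time, detail scales with camera proximity while the animators only ever touch the light rig.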
For both the full-CG and the wing-addition shots, they could use the same lighting and rigs. But for the wing-addition shots, the matchmovers used a separate rig for tracking, with a special tool for muscle movement.