
The visual effects team at Double Negative completed a complex sequence for ‘Iron Man 2’, set on the Grand Prix racing circuit in Monaco. It gave the artists a chance to show their skill in everything from on-site supervision to CG environments and 3D modelling to fight choreography.



Much of the environment work was focussed on the 'fight area' where the majority of the sequence takes place, involving about 250 shots. This is the waterfront section of the Grand Prix circuit between the turns named ‘Tabac’ and ‘Piscine’. Double Negative accompanied the 2nd unit crew to Monaco in May 2009, in the lead-up to the real Grand Prix, to shoot digital stills, collect reference video and gather LIDAR and survey data for the entire circuit and a large section of the surrounding city and local terrain.

Monaco Survey
“We LIDAR-scanned the track itself and some of the surrounding buildings. We also took 180-degree panorama ‘STIGs’ from each side of the track every 10 to 15 feet all the way around the track. This gave us the necessary information to recreate reflections and interactions on the race cars, the Rolls Royce and Iron Man,” said VFX Supervisor Ged Wright, who led the team with 2D Supervisor Victor Wade. “STIG is proprietary software we initially developed on ‘Batman Begins’ that allows very high resolution stitching of stills photography panoramas. This, plus other in-house tools, made some of the tasks more straightforward. Our 3D pipeline is based on Maya and RenderMan.”
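STIG itself is Double Negative's proprietary software, but the underlying idea of stitching overlapping stills into one high-resolution panorama can be sketched with an off-the-shelf library. The snippet below is a minimal illustration using OpenCV's general-purpose stitcher; the folder name and file pattern are hypothetical, and this is not the studio's tool.

```python
# Minimal panorama-stitching sketch using OpenCV (not STIG, which is proprietary).
import glob
import cv2

# Hypothetical folder of overlapping stills shot from one track-side position.
paths = sorted(glob.glob("trackside_position_042/*.jpg"))
frames = [cv2.imread(p) for p in paths]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("trackside_position_042_pano.jpg", panorama)
else:
    print("Stitching failed with status", status)
```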

The team, a diverse group of 140 at the peak of production, needed all of this precise information to be able to reconstruct the entire scene with photography and simple models. However, the most detailed area, the fight area, was to be modelled fully in 3D: the stands, fencing, barriers, harbourside boats and other elements all came together as a complicated 3D build, allowing the team more flexibility later in production.

Moving to LA
In June, principal photography got underway at Downey Studios in Los Angeles, where a partial re-creation of the fight area had been constructed. Double Negative travelled out and collected extensive reference and data from the Downey set, again in order to build a digital equivalent to incorporate into their wider digital Monaco. In fact, they ended up stitching Monaco into the Downey set – since that was where principal photography took place, it became the simpler solution.

“We ended up with 7TB of raw stills data, which is a very large amount of data to wrangle,” said Ged. “Somehow, no matter how thorough you are, something always seems to get missed, but on this show there wasn't a great deal we didn't have reference for. All-CG shots are really challenging. We ended up doing more than we had planned, because the only way to get the shots where they need to be is to keep going until you run out of time. However, it all proved vital down the track, serving as the basis for the digital Monaco.”

The 500 ft long set built at Downey Studios was accurately re-created, including the track, barriers, chain-link fence, stands and press box, all modelled and textured to hold up full frame. This set build was then further extended to incorporate authentic Monaco surroundings, including 15 yachts to populate the visible waterfront. Again, these had to be modelled, textured and dressed to hold up full frame in shots. Beyond the main fight area, the Automobile Club de Monaco headquarters and ten background buildings were also created.

Cars and Crowds
Further still, beyond this set build, distant Monaco was recreated using high-resolution digital stills shot on location and combined with STIG into a 360-degree HDR, gigapixel panorama, or cyclorama. This was exported as multiple high-resolution matte-painting tiles so that all foreground objects could be removed and a race-day feel given to the entire sequence, adding race marshals, flags, crowds and banners to the buildings, before the tiles were recombined to serve as a Grand Prix backdrop covering all directions.

Before Whiplash attacks Stark, viewers follow the Historic Grand Prix race through Monaco’s streets. Throughout the race, every car you see is digital. Double Negative created two different, detailed digital versions of each of the 11 cars appearing in the sequence for a total of 22, based on five cars the art department built especially for this sequence and six genuine vintage GP cars brought in to make up the rest of the grid. The non-digital cars are only seen static on the starting line before the race begins. 3D Lead Jenni Eynon handled much of the fight area environment build and the cars.

Throughout the entire Monaco sequence, crowds to populate the Monaco stands and buildings were generated with photographic techniques. First, Gavin Harrison from the team used a 2D sprite system in Houdini, a procedural set-up for distributing filmed elements of crowd people. A 3D set-up was also used, achieved photographically with 16 stills cameras arrayed around the capture volume. The captured crowds were projected onto geometry constructed using a system called Double Vision, a stereo reconstruction tool that Double Negative developed when working on ‘Quantum of Solace’. It uses a stereo pair of images from a scene to reconstruct its geometry by analysing the slightly offset views of objects in the images. Then, to break up the crowds projected onto this geometry, CG agents performing specific actions were added into the crowds.
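Double Vision is likewise an in-house tool, but the stereo-reconstruction principle it relies on, comparing two slightly offset views to estimate depth for every pixel, can be illustrated with a standard block matcher. The sketch below uses OpenCV; the image names, focal length and camera baseline are assumptions for illustration only.

```python
# Illustrative stereo-depth sketch (OpenCV), showing the principle behind a
# stereo reconstruction tool, not Double Negative's Double Vision itself.
import cv2
import numpy as np

left = cv2.imread("crowd_left.jpg", cv2.IMREAD_GRAYSCALE)    # hypothetical stereo pair
right = cv2.imread("crowd_right.jpg", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching: disparity is the per-pixel horizontal offset
# between the two views, which is inversely proportional to depth.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

# With an assumed focal length f (pixels) and camera baseline b (metres),
# depth = f * b / disparity wherever a valid match was found.
f, baseline = 1200.0, 0.25
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * baseline / disparity[valid]
```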

Whiplash Attacks
Double Negative designed the look for Whiplash's whips and the extensive damage they cause to the cars, racing track and Iron Man himself. The Whiplash set-up is meant to look rough and homemade, like a prototype weapon instead of sophisticated technology. Whiplash designed the whips to cut through Iron Man's armour by generating a stream of plasma down the length of each whip. Large quantities of electricity tend to leak out and arc to nearby metal objects.
The team decided to give them a plasma core and tried several approaches, settling on a dangerous-looking, electrical effect. However, they had problems developing the effect to work across all the shots in which it appears. They all needed individual tweaking and art directing to feel right, shot by shot, which was time-consuming but necessary, because part of the attack sequence takes place before Tony Stark has suited up as Iron Man.

The whips’ effects, based in Houdini and achieved by effects lead Joe Thornley, also involved an animation phase. Whiplash actor Mickey Rourke was more at ease performing with real bullwhips in his whip handles than with nothing there, which meant that most shots had this practical element to be painted out, but it also allowed the team to see what the animation should show.

Molten Metal
Animation Lead Andy McEvoy worked on the animation in Maya, from which a large amount of data could be exported to the curves and locators in Houdini. From there they could generate all of the electrical effects, interaction effects and gouging of the ground. These were passed back to Maya and rendered, along with the practical whip.
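The article doesn't detail how that Maya-to-Houdini hand-off was implemented, but a common, generic way to move locator animation between packages is to sample world-space positions per frame and write them to a simple file. The sketch below illustrates that idea with Maya's Python commands; the locator naming and CSV layout are assumptions, not Double Negative's pipeline.

```python
# Generic sketch: bake world-space locator positions per frame to a CSV that
# another package (e.g. Houdini) could read to rebuild curves. Names and paths
# are hypothetical.
import csv
import maya.cmds as cmds

locators = cmds.ls("whip_locator_*", type="transform")           # assumed naming
start = int(cmds.playbackOptions(q=True, minTime=True))
end = int(cmds.playbackOptions(q=True, maxTime=True))

with open("/tmp/whip_locators.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["frame", "locator", "tx", "ty", "tz"])
    for frame in range(start, end + 1):
        cmds.currentTime(frame, edit=True)
        for loc in locators:
            tx, ty, tz = cmds.xform(loc, q=True, ws=True, translation=True)
            writer.writerow([frame, loc, tx, ty, tz])
```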

Numerous effects passes of different types were needed to gain enough control in the composite to achieve the look they were after. The look also depended on how quickly the whips were moving and how large they were on screen in each shot. Ged said, “Adding to the difficulty was the fact that the whips were electrical and tesla-based – they look very effective when arcing between two stationary objects, but as soon as they were thrown around in the fight, the whips’ effect got more complicated.”
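Splitting the effect into separate passes is what gives the compositor that control: each layer can be graded on its own before everything is merged. The NumPy sketch below illustrates the principle with stand-in passes and simple screen and additive merges; the pass names, gains and blend choices are assumptions, not the show's actual comp setup.

```python
# Toy compositing sketch: independent per-pass grading before merging, using
# stand-in arrays in place of real render passes (normally EXR layers).
import numpy as np

h, w = 1080, 1920
plate = np.random.rand(h, w, 3).astype(np.float32)        # stand-in background plate
core = np.random.rand(h, w, 3).astype(np.float32) * 0.2   # hypothetical whip core pass
glow = np.random.rand(h, w, 3).astype(np.float32) * 0.1   # hypothetical glow pass
arcs = np.random.rand(h, w, 3).astype(np.float32) * 0.05  # hypothetical arcing pass

def screen(a, b):
    """Screen blend: brightens without clipping as hard as plain addition."""
    return 1.0 - (1.0 - a) * (1.0 - b)

# Per-pass gains are where the shot-by-shot art direction happens.
comp = screen(plate, core * 1.5)
comp = screen(comp, glow * 0.8)
comp = np.clip(comp + arcs * 2.0, 0.0, 1.0)  # additive for the hottest elements
```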

The whips also produce other effects such as sparks, molten drops of metal, grounding arcs and show a visible trail when they are swung to strike. The idea was to make them similar to industrial metal cutting equipment, well-integrated into the surrounding environment. They gouge the track surface, slice through several of the racing cars and eventually tear into Iron Man. While the car slices were achieved as practical effects, they required augmentation with photo-real CG sparks, embers, molten metal, smoke, dust and flying chunks of debris.

Their most complex FX task in the movie was showing the whips interacting with Iron Man. This 'Thermite' effect, named after a type of pyrotechnic metal reaction, combined all of the other whip effects to generate smoking, molten metal streaming from Iron Man when the whips wrap around him. VFX artist Eugenie Von Tunzelmann created this in Houdini, where she could develop what it should look like and then set it up as a script for other artists to use across multiple shots. They’ve found that, once artists are familiar with Houdini, they can create effects with less scripting than with Maya.

Mk V armour
Showing Tony Stark suiting up in his new Mk V 'suitcase' armour was a major puzzle that Double Negative were asked to unravel, comprising the 50 shots at the end of the sequence that feature the new armour. This portion of the work posed a couple of different challenges. First, the team had to make the deployment and assembly of the armour plausible. “Originally we had envisioned it would be fairly flexible or organic, with the various plates that form from the suitcase sliding over him. But this meant making this suit behave very differently from the previous Iron Man armour. We ended up locking down the pieces so they wouldn’t move so much and felt more like the previous incarnations,” Ged said.

The first ‘Iron Man’ film was used as a reference for this sequence, because the director wanted to maintain the character’s look. But this suit’s behaviour was in fact quite new. They had to make it grow across the actor’s body without letting it feel as though the parts were moving by magic. Eventually they went back to quite a simple, mechanical look that didn’t demand that the audience believe ‘too much’.

With the suit made up of about 3,000 parts, individually modelling, rigging and texturing its components was daunting, although these tasks could be automated to an extent. As the fight plays out, several stages of visible damage are also called for, all requiring further hand modelling and texturing. “The greater challenge was creating the parts in a form that was usable or available for RenderMan. We haven’t found an elegant solution for this problem yet, but we are working on it. For this project, we handled it partially manually and with a lot of scripting,” Ged said.
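The article doesn't describe the scripts themselves, but the flavour of that work, touching thousands of parts programmatically rather than by hand, can be sketched in Maya Python. In the hypothetical example below, each suit-part transform is exported to its own Alembic archive so the pieces stay individually referenceable downstream; the naming convention and paths are invented for illustration.

```python
# Hypothetical batch export: one Alembic archive per suit part. Naming and
# paths are assumptions; this is not Double Negative's actual tooling.
import os
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)

export_dir = "/jobs/ironman2/assets/mk5_parts"                 # assumed location
parts = cmds.ls("MK5_*", type="transform", long=True)          # assumed naming

for part in parts:
    short_name = part.split("|")[-1]
    out_path = os.path.join(export_dir, short_name + ".abc")
    # A single-frame export per part keeps every piece addressable on its own.
    cmds.AbcExport(j="-frameRange 1 1 -root {0} -file {1}".format(part, out_path))
```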

Suitcase
They studied the Mk III suit-up in the first movie, knowing that they would have to compress the complete suit-up gear into the suitcase. “The suit-up sequence itself was handled as a separate section with a separate team. We had to consider, ‘How do we make sense of something coming out of a small, virtual volume and covering a character – without making it feel too magical or artificial?’ We started by looking at the fully formed suit and seeing if we could fit that into the suitcase – and working backward from there. ‘What is each shot trying to communicate?’ We needed each shot to deliver a clear message. Jon Favreau didn’t want us to make the audience feel we were cheating too much. They needed visible evidence of what was happening.

“During the concept period on this suit, a collection of stills was used to communicate exactly how it would form over him, while making small, invisible cheats. This method got us through the look but we needed to progress into 3D to solve the next stage - the performance aspect of the scene. Defining this performance took some time. The final outcome was a combination of motion capture and keyframing.”

Fight Choreography
Animation lead Paul Davies took the action of Downey’s stunt double, which was filmed from four or five different cameras, with a result that was similar to a motion capture session. As Iron Man's face is expressionless, his posing and movement are critical to convey the right attitude.
But the footage from this performance didn’t quite give them the reference that they needed. So when they returned to London, they had to carry out some actual motion capture sessions to capture the more critical moves. Due to the interaction with another character, the most challenging parts of the performance were when Iron Man is wrapped up in the electrical whips and changes his centre of gravity several times. The capture sessions let them accurately block in most of the performance. Nevertheless, they needed to keyframe more or less the entire sequence. None of it relied on raw motion capture.

Sometimes for such scenes, they can work directly with the principal photography, but on this scene it was a matter of rotoscoping out Mickey Rourke, re-lensing and changing everything else in the shot. To overcome some of these problems, the team developed a collaborative relationship with the cutting rooms and the previs team in LA, changing animation, shot framings and timings, creating new all-CG shots and offering a mini-cut of how the new material might play.

Vision Capture
Nick Markel, VFX Supervisor at The Third Floor, the company providing previsualisation for the movie, said that, between previs and postvis, his team produced about 3,500 iterations for the movie. Their work aimed to let Jon Favreau see his ideas in a more finalised state and work around the FX-intense shots with better live-action shooting. Nick’s team, ranging in size from four or five up to 20, worked on the project from preproduction through production until halfway through post. “By letting Jon see the bigger picture, we could help the editors do more accurate intercutting. Also, by handing our previs to the VFX teams, the artists could start their major sequences even before production finished to avoid rushing at the end.”

Influences
“On ‘Iron Man 2’, as all the teams and department heads got to know each other – what they need to do their jobs, how they work and interact – they were making a lot of changes and multiple iterations for every shot. The achievement there was in the shots that did NOT get shot, saving on-set time and expensive VFX – in effect, optimising the production for time, money and talent,” said Chris Edwards, Creative Director at The Third Floor.

Influences from all collaborators were incorporated – production designers and the art department working on the characters, environments and structures they plan to build and test. The previs team examines all of these inputs and starts to build them in 3D in Maya, their core application for characters, environments and props. Motion Builder is their tool for virtual production.

Real Time
The director can view the shots and indicate the changes he wants, such as character positions, light and shadow. The goal is to be in complete control of ‘blueprinting’ the whole movie as assigned sequences. With this animated previs, the editor can cut together storyboards, and this edit receives the dailies from the shoot, to be replaced later with the finished footage from post.
To be able to change shots on the fly and adapt them into the sequences that will ultimately be shot, the team works in real time, relying on hardware rendering, or playblasting as in Maya, rather than software rendering. Hardware rendering happens in real time, like a video game, and render quality depends on the graphics card. Chris said, “Because we are working with directors, we have to limit previs refinement to what is achievable in real time. The asset builders model, texture and light everything so that the real time look is as compelling as possible – effects, rain and weather.
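In Maya terms, that hardware-rendered real-time output is the playblast. The call below is a minimal, hypothetical example of blasting a previs shot range straight to a movie for review; the frame range, resolution, codec and output path are assumptions, and codec availability depends on the workstation.

```python
# Minimal playblast example: hardware-render the current viewport to a movie.
import maya.cmds as cmds

cmds.playblast(
    startTime=1001, endTime=1120,          # hypothetical shot range
    format="qt", compression="H.264",      # codec availability is machine-dependent
    widthHeight=(1280, 720), percent=100,
    quality=90, viewer=False,
    filename="/tmp/previs_shot_010_playblast",
)
```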

Choreography Design

“Lighting wasn’t quite ready for ‘Iron Man 2’ previs but is something we would like to refine enough to get usable information about lighting scenes,” said Chris. “Reproducible conditions are the goal because lighting can be difficult to organise on set, so we are experimenting with presets that approximate real lights. The more realistic the previs, the better the DP will know how to move through scenes and compose the shots.”

Nick said, “One of our strengths is in stunt choreography, hand to hand combat and helping live actors interact with effects – getting the eyelines and timing right, for example. The final climactic sequence, in which Iron Man and his partner, also in the armour, take on an army of remote controlled drone robots, was especially demanding for this kind of interaction.

“Our earliest view of shots was usually in board form, passed on to us to start creating the sequence. Then the crew went and shot all the plates for the environment and then we would move to postvis. During production, we were handling the visualisation of the sequences right through and finally handing it off to the VFX team.”

Virtual Camera System
Superhero movies are a good example of the demands on actors to ‘act’ in a virtual scene, pretending to interact with a virtual world they cannot see. Virtual production, or on-set visualisation, tries to close the gap between concepts in preproduction and the results in post. Historically, on-set production was something preproduction teams didn’t know much about, and very little digital technology was used on set.

‘Iron Man 2’ aimed to put the virtual camera in the hands of the DP and allow him to shoot inside the previs world. “We weren’t quite working to the level reached in ‘Alice in Wonderland’, which was essentially in real time – shooting live actors while seeing the virtual environment they were to appear in – perfecting that is the current goal for on-set visualisation.
“In this movie, however, we were figuring out how to engage the filmmakers – the director and the DP – to let them interact with the virtual world, much like creating a custom video game for every beat of action in the movie.”

Action Beats
The company has been using Xsens MVN motion capture suits, which are wireless, don’t require a camera system and can be used on set or in other production environments on the real or stunt actors. Their motion data is captured on a computer and can be given to the stunt choreographer, as it was for ‘Iron Man 2’. Nick Markel would collaborate with the stunt choreographer and his team, who had already blocked out the action in advance.

Nick would capture these performances as accurately as possible and recreate all of these action beats, two characters at a time, inside Motion Builder to save as master scenes. These scenes represent the action that will be shot from multiple angles. The master scenes are loaded up, played in real time for the DP, who then continues the process with a virtual camera system.
“This is like picking up a monitor and being able to look into it and see the virtual world while you move it around the room you are in,” said Chris. “You can look backwards, forwards, get low, high, and scale yourself as a giant or a tiny mouse – as well as place yourself anywhere in the scene we have created.

Exploratory Shooting
“The DP is reacting to real actor performances, and can pick up a physical camera and shoot ‘normally’ but within the virtual world and can replay the performances over and over in exactly the same way and prepare for shots they need to shoot live. DP Matthew Libatique really embraced the exploratory process of shooting the actor to get the master shots, heads and tails, different angles and coverage of the action.

“These shots would go to editorial, where everyone involved could see the clips cut together and start refining the shots with pans and zooms, all prompted by seeing in advance how it would look. Then they could capture the additional footage to craft the movie with.”

VFX Postvis
When the first plates come back from the set, the editor looks at the footage, narrows down the takes to use and decides roughly how to cut them together. For an action movie like ‘Iron Man 2’, where a large number of takes will need VFX work, the previs team can make the enhancements rapidly as a rough pass for the editor, called postvis, animating the performances, characters, creatures, explosions and environments.

“First, plates are tracked through 3D tracking software like Boujou or, more lately, Pixel Farm’s PFTrack. Then we use a virtual camera that matches the live action plate and start comping in the previs elements. This provides editorial with more footage that looks more like the complete, original vision, built into the previs and edited down to the frame. Directors can show sequences laid down this way to a studio or test audiences to make sure that they have a sequence worth post-producing. The cuts can be sent to potential FX vendors, so they can make leaner, more accurate bids for jobs and organise resources.”
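Once a plate is tracked, the solver hands back a camera for every frame, and placing a previs element over the plate is then a matter of projecting its 3D position through that camera. The NumPy sketch below shows the projection step with a stand-in 3x4 matrix; the values are illustrative, not a real solve.

```python
# Projecting a 3D previs point through a per-frame camera matrix (stand-in values).
import numpy as np

# 3x4 projection matrix P = K [R | t] for one frame, as a tracker might export it.
P = np.array([
    [1200.0, 0.0,    960.0, 50.0],
    [0.0,    1200.0, 540.0, 12.0],
    [0.0,    0.0,    1.0,    4.0],
])

def project(point_3d):
    """Project a world-space point into pixel coordinates for this frame."""
    x = P @ np.append(point_3d, 1.0)   # homogeneous coordinates
    return x[:2] / x[2]

# A previs element at world position (0.5, 1.2, 6.0) lands at this pixel:
print(project(np.array([0.5, 1.2, 6.0])))
```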

 
Day at the Races
Tony Stark, inventor and character behind the Iron Man armour, owns one of the cars competing in the race. He suddenly decides to pilot the car and race it himself. But with the race well underway, his arch enemy Whiplash walks out onto the track disguised as an official, armed with a pair of lethal electric whips he uses to slice into the race cars.
Tony’s trusty chauffeur and his right-hand girl, Pepper Potts, see the attack on TV and jump into his Rolls-Royce, driving onto the track to rescue him as the race cars stream past. Stark escapes from his own car’s wreckage as the race cars pile up on the track. Whiplash closes in, trapping Tony amid flaming wreckage. The Rolls-Royce zooms around the bend into Whiplash, pinning him against a barrier but he soon recovers and slices the Rolls into pieces before Tony can escape.
Pepper throws Tony a suitcase containing his portable Mark V Iron Man armour. He activates the suit, transforming himself into Iron Man, and tries to bring Whiplash down. But his opponent tears into the Mark V armour and wraps his whips around Iron Man's neck. Iron Man tangles himself further into the whips but finally manages to disable their power system and put Whiplash out of commission.
Creating Options
Chris Edwards views previs as support for directors from early in preproduction to passing off the last shot from previs into the VFX pipeline for post. “The software used now for previs can pull practical technical data from planning sessions and check values for feasibility. For VFX, for example, the data can help them make decisions on which shots can be live action and which should use digital VFX. To some extent it can replace models, storyboarding and figures with more precise information, and it can be a medium that works for art direction, effects, editing and cinematography.

“Maya and Motion Builder files are created, the common file formats used in FX studios, so these can be directly handed over to artists. We produce different views of the set, charts, graphs and measurements, illustrating what the camera can cover from different positions.” For example, in ‘Cloverfield’, they plotted out a complex sequence of handheld camera work that looked spontaneous but was highly choreographed between the actors’ moves and the cameraman.
Nick Markel says Motion Builder works best for integrating motion capture data into their virtual camera system, to capture shots needing an authentic ‘hand-held’ shooting feel. They use the camera in Maya for the smoother, flying type of shots.

Words: Adriene Hurst
Images: Courtesy of Paramount Pictures
and Double Negative
Featured in Digital Media World.