Digital effects supervisor Ken Hahn tells how SPI’s digital teams sent Agents J and K back through time and to the top of the Apollo 11 launch gantry in ‘Men in Black 3’.
Sony Pictures Imageworks started work on Men in Black 3 in June 2010. Their initial involvement included camera testing and decisions on whether to shoot digital or on film, stereo or mono. SPI’s VFX supervisor Ken Ralston and 3D VFX supervisor Corey Turner tested various camera set-ups with the DP Bill Pope and director Barry Sonnenfeld.
They opted to shoot on film and use stereo post conversion, deciding that this scenario would work better on set with the director’s and DP’s shooting style for the movie, and that a stereo conversion would give them more flexibility in post. Of the project’s 1,214 visual effects shots, some 650 were completed at Imageworks.
SPI’s digital effects supervisor Ken Hahn joined the project later, in November 2010, by which time the production had produced rough treatments but no firm script for the movie. However, major decisions had been made about the production, including the use of digital characters to replace two of the lead actors in certain scenes. The VFX team would also need to build digital assets of New York City and Shea Stadium, and a re-creation of the Apollo 11 lift-off at Cape Canaveral. Getting started early, before the story was finalized and the shoot began, allowed time for R&D and research of the locations.
“Comprehensive on-set data acquisition was particularly important for this project,” said Ken Hahn. “I wasn’t working on set myself but our on-set team captured HDR images, LIDAR scans and a lot of photography. For the jump-back-in-time 1969 sequences, the production could provide a certain amount of information because they had to build a small physical portion of the set for the scene at Shea Stadium, which was torn down in 2008. Our team extended and built out that partial set digitally.
“Getting accurate information about the stadium was harder than expected. There were no blueprints, and even the photographs revealed a lot of modifications over the years. So we focussed on getting the right dimensions, footprint, levels and rows of seats. When we retro-fitted the physical set used in the shoot into our digital set piece, we only needed to make minor adjustments to accommodate the camera angles.”
Shea Stadium was just one example of a set in which Barry Sonnenfeld wanted the team to ‘cheat’ the design and camera position slightly to add some drama. For example, the angle of view from the upper deck wasn’t quite steep enough, so to create the more exciting, eye-in-the-sky style shots he was after, the seats were cantilevered more steeply.
“Our digital version of the Chrysler Building, which still exists, was easier to construct,” Ken said. “Blueprints and overall dimensions were readily available and parts of it could be LIDAR scanned, although it was too big to capture the whole building. Again, Barry wanted to cheat the dimensions to produce more grandiose views. The actual building is 1,000ft tall, while in some shots it stands 1,500 or 3,000ft high. Nevertheless, we always started with the perfectly accurate model and modified it from there.”
For the Apollo 11 sequences, fortunately, plenty of reference was available from NASA for both the physical set and the digital build, which stood 400ft tall. VFX Supervisor Jay Redd also went to Cape Canaveral and shot from the gantry of one of the Space Shuttles, capturing 360° imagery for the numerous matte paintings Ken’s team produced. A live shoot took place on a beach in Long Island as well, chosen to stand in for the beach at Cape Canaveral, but the beach area was quite small and needed considerable extension work to resemble Florida.
Ken said, “Every shot across the Cape Canaveral section of the movie involved VFX and digital work. Nothing could be captured completely in camera. We had green screen shots and set extensions, and post-production decisions about camera angles and shot composition meant several more all-digital shots in the studio than we expected. Sometimes the captured footage could only be used as reference to digitally re-create shots closer to what the director wanted.”
Time was working on their side in a couple of ways, however. First, the shoot was completed in two parts: after the first stage in December 2010, the production took a break before resuming from March to June 2011. Also, as soon as Jay Redd and Ken Ralston started work in New York and recognised the space constraints on the sound stages, they alerted the team in the studio that the scope for digital environments would grow.
Through the Pipeline
One of Ken Hahn’s main responsibilities was pipeline design. He explained, “While SPI has a standardised, robust feature film pipeline, it is also designed to be flexible and extendible per show. The software relies mainly on Maya for modelling and animation, and on Katana, which was developed at SPI, for lighting. Our team has been testing and using it for seven or eight years. This is an advantage of working at a large studio with enough artists to spend time refining tools. For example, up to now we were also using Katana for compositing, but we’ve been adapting Nuke to our pipeline and, in particular, have started using Nuke tools for the stereo conversion work.”
Mari was used for texture painting for the first time at the studio, and saved a lot of time. The lead texture painter John Wallace had been keen to try it out and specifically devoted time to learning how to use it. Although none of the other texture painters had worked with Mari before this, he estimated that whatever time they spent learning the software was more than made up in productivity.
Most of the effects were created in Houdini 12, which the effects team had started using as an alpha product to take advantage of its dynamic simulations. For ‘Men in Black 3’, large-scale fluid work was needed to create the volumes of flame and smoke at the Apollo 11 lift-off. The Bullet solver, now integrated into Houdini, also proved useful on the production, and Ken mentioned that GPU acceleration made a notable difference across their effects and simulations.
Open Shading Language
Lighting set-ups were not always straightforward, because the light from plates, green screen sets and live action was often inconsistent through each sequence from beginning to end. Ken said, “As usual we let the HDR images from set take the lead, but from there, input from the artists, CG supervisor and VFX supervisors was used to highlight the action or the particular focus in the images, requiring aesthetic choices as well.”
Several scenes needed digital doubles of the actors playing Agent J and young Agent K, and also Griffin, one of the aliens. “We were able to identify early on where we’d need to intercut, shot-for-shot, from live action to CG. But as the work proceeded we found that we sometimes needed to transition from live to CG within a single shot, which put extra demands on quality,” said Ken.
“For this movie, I had just come off the production of ‘Hancock’, another movie starring actor Will Smith, and so I could re-use aspects of the digital double we made of him for that film. As another advantage, we had no speaking parts to deal with. I do believe that character performance work should always be done by the real actor – it’s what they do best, after all.
“To our advantage, in most shots the doubles were involved in stunt-type work, or helped to create those wide shots the director wanted when the camera hadn’t been able to move out far enough on the sound stages. Getting the actors, sets and lighting captures, scanned and recorded as thoroughly as possible while they were available was crucial to achieving high quality within the time and budget we had. This was no different to any other film – once those things are gone you don’t have another chance.”
Production supplied the characters’ costumes to help the cloth team develop a very close physical match for the clothing characteristics. “We watched the footage very closely for the behaviour of the layers of clothing as they responded to the dynamics of riding on motorbikes in the wind, or buffeting in the air as Agent J jumps off the Chrysler Building, for example.”
Animation supervisor Spencer Cook, who animated the doubles, helped with refinement as well and was open to trying techniques beyond keyframing, including motion capture and Endorphin, physically based software for simulating how bodies react during stunt work.
“For the fish character we met the usual problems of extrapolating a 2D image to 3D. A lot of discussion took place between the modelling supervisor Marvin Kim and the modelling team, Barry and Ken Ralston regarding eyes, whiskers, the precise fin orientation, number and placement, fat vs muscle – in short, deciding exactly what this character was going to be. Designs and proportions ranged from walruses and seals at one extreme – fat and loose-skinned – to a large, sleek, muscular game fish like a tuna, to shark-like. In the meantime, Spencer began deciding on pose and orientation, which helped us refine the sculpt.”
Sorting out the design work still left the textures and surfacing to develop. John Wallace’s colour palettes and the CG team’s work experimented with slimy, scaly and skin-like qualities, patterns and iridescence – it involved tremendous iteration between teams. Ken commented, “It’s interesting that a lot of decision making of this type was done well after principal photography. The fish sequence in particular had been shot using only rough animatics prepared at The Third Floor, without a firm idea of what the final product would be for the fish itself, either in looks or behaviour.
“We had a few constraints based on the plate photography as a guide. But the point for us is that these decisions all used to be made up front, whereas now they can be pushed into the visual effects stages. The VFX teams have design and storytelling techniques like post-vis and digital photography to accommodate post-production decisions. Camera moves in all-CG shots may need to replicate the tasks the DP, focus puller and camera operator perform on set to support the cinematographer.”
Curtains of Time
While time travel was a major story element for this film, once again the script offered few clues other than “Agent J falls through a curtain of time . . .” Agent J secures a device enabling time travel but must use it while falling from an extreme height to achieve the critical speed required for it to work. He chooses to leap from the top of the Chrysler Building. Millimetres before he hits the pavement the device kicks in and transports him to 1969. As he plummets, numerous intermediate time periods flash past.
Visualising this ‘time jump’ was a major step. Storyboards and pre-vis involved a degree of prototyping, but it was only developed to a crude stage. What Ken Hahn’s team received from The Third Floor as previs on this was basically colours passing through frame before the action resumed in the destination time period.
Consequently, much of the decision-making for effects in the time travel sequences fell to the visual effects teams again. They studied some of Barry’s previous movies, like the earlier ‘Men in Black’ films and ‘Wild, Wild West’, looking for likely themes. However, this kind of idea was new for him.
The initial concept called for something protracted, complex and intricate, but the team needed to devise a much shorter, more dynamic shot. “It didn’t get much time in the film,” Ken said. “From several design ideas we put forward, Barry responded best to the idea of the Aurora Borealis. But this didn’t present an easy solution. Because the Aurora is always seen over a black night sky, it meant somehow creating a distinctive light-emitting effect, suggesting excited gases, in broad daylight.”
Gone in a Flash
Reflecting on the project, Ken noted that Ken Ralston had alerted him as early as possible that many shots across the film would pan out this way – they should be ready to build a huge variety of effects that would very likely be used only once, risking not being noticed at all. “For example, as Agent J tumbles down to street level, the carefully constructed time periods flash by so fast that the myriad period details we incorporated may never be appreciated by the audience.
“It’s not the most efficient way to work – of course we like to build something and then use and re-use it many times to ensure its soundness and get more value from it. The recreations of the Apollo 11 structures and Shea Stadium were well used; others were abandoned after a single shot. We just had to be ready to produce very high quality work under both scenarios. We had our heroes and our unsung heroes.”
NEXT WEEK: ‘Men in Black 3’ returns, focussed on Animation Supervisor Spencer Cook and a description of the process behind the movie’s alien characters, combining live action, prosthetics and animated digital extensions.
Words: Adriene Hurst