Sarofsky Scales up the Marvel Universe for ‘Ant-Man’ Titles
Design and production company Sarofsky in Chicago created the CG animated main-on-end titles for Marvel's ‘Ant-Man’, continuing the team’s titles work for Marvel’s earlier movies, ‘Guardians of the Galaxy’ and ‘Captain America: The Winter Soldier’.
Sarofsky delivered the two-minute ‘Ant-Man’ sequence at 4K resolution, in both 3D stereo and 2D versions. Viewers take the point of view of a camera as it descends from above the earth, down past and through familiar places and objects in our world, into microscopic worlds and finally to subatomic particles. All of this is visualized as a dark, luminous universe in which the camera sinks from one field of glowing shapes down to the next. Credits appear framed inside boxes that are locked to the camera's POV.
For Sarofsky’s team of designers, animators and VFX artists, this project was a journey through everything from native stereo 3D production to CG animation, and from the stratosphere to subatomic particles. It called for an efficient CG pipeline using MAXON Cinema 4D as the primary modelling and animation application, rendering out multi-layered passes to use within NUKE to create and control a distinctive, glowing Marvel look.
All About Scale
While the titles reference Charles and Ray Eames' famous 1977 short film ‘Powers of Ten’, their main inspiration draws directly from ‘Ant-Man’ and the Marvel Universe themselves. "For a film called 'Ant-Man’, you might expect a main title to incorporate ants," said Erin Sarofsky, owner and Executive Creative Director of the studio. "But one of our favourite aspects of the film is how subtly it switches scale, leading us to explore a few concepts that were exclusively about scale.
"We also wanted to develop a look that was unique in the Marvel Universe and connected visually to the film without feeling like just another scene in the movie, where we could depict the macro and micro worlds in the same visual language. The production design for Marvel’s Pym Particles, contained in a glowing red liquid, led me to research fibre optics and other types of lighting, specifically how textile artists weave that lighting into clothes. That became the inspiration for the look, and everything flowed from there."
Pym Particles are subatomic particles of an extra dimensional nature that are capable of shedding or adding mass, reducing or increasing the scale of any form of matter. They can compress physical forces around the objects or organisms that they are applied to, and increase the density and strength of the subject.
Sarofsky’s main building material for every asset in the titles is fine translucent filaments filled with the slowly flowing, glowing red particles, adapted to shape the coastlines seen from above, aircraft, buildings, trees, grass, single-celled creatures, soil - down to the shapes of subatomic particles themselves.
Having a unified look was important to Erin, but meant developing something that would work at all scales. She studied scaled architectural models and abstract renderings of objects that were wrapped or encased, highlighting their form but in a totally different material. A singular look also helped them define their production process faster, and move on to the details like line thickness and how quickly the particles inside should move – all of which were determined right at the beginning of the motion test phase.
During the R&D phase, their tests covered options from particles with tracers to auto wrap tools that would algorithmically wrap paths around an object. In the end, simple splines wrapped by hand became the best way to create the organic shapes they were looking for. VFX supervisor Matthew Crnich said, "We were producing the 'Ant-Man' titles imagery natively in stereo, and that presented some hurdles in itself. But defining the structure of the 3D geometry as splines became the biggest modelling task on this project."
Sarofsky's artists have become experts at using Cinema 4D for projects of this scale. Creative director John Filipkowski said, “We wanted the splines to feel full of life and energy when travelling through the scene. While testing, we came upon a simple way to create the look of the glowing filaments on the splines by animating some of the material parameters. This material produced beautiful results, but turned out to be very time consuming to apply to the objects in each scene. Each object had to be addressed individually depending on how fast the camera moved and how large the scene was. Only after the artists really dug in and knocked out a scene or two could they really get the hang of it.
“Because we constantly needed to change the look of the splined objects, animating parameters was also the most flexible option. For some scenes, it was the number of splines, in others it was the thickness or length, and in some scenes, all of these variables had to be animated as the camera moved by. Cinema 4D came through as a great tool for this type of work by allowing us to make adjustments non-destructively with splines and sweep nurbs.”
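As a rough illustration of the approach described above, the per-scene spline parameters can be thought of as simple functions of camera speed and scene scale. The function name, constants and mapping below are invented for this sketch and are not Sarofsky's actual Cinema 4D setup.

```python
def sweep_params(camera_speed, scene_scale):
    """Hypothetical per-frame parameters for the glowing spline sweeps.

    Sketch only: thickness and filament length are driven by how large
    the scene is and how fast the camera moves, rather than being
    modelled in by hand. All constants are illustrative assumptions.
    """
    thickness = 0.01 * scene_scale                 # thicker lines in larger scenes
    length = min(1.0, 0.2 + 0.05 * camera_speed)   # longer filaments when moving fast
    return {"thickness": thickness, "length": length}

# Example: a large scene with a fast camera move
params = sweep_params(camera_speed=4.0, scene_scale=10.0)
```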
Lights and Looks
As the lights and lighting are critical to the whole piece, all three programs - Cinema 4D, NUKE and Autodesk Smoke - were essential to achieving the final look. Cinema 4D was used to create and light the splines, before rendering out multi-layered greyscale passes specifically for use within NUKE, where all the passes were brought together to create the colour, glows, look and feel seen on screen.
Because they set up the NUKE composites to work procedurally, each new render from Cinema 4D would automatically generate a closer approximation of the final look. “By generating the design and animation within Cinema 4D but creating the look within NUKE, our team had the flexibility to make significant changes within the composite rather than going back to 3D,” Matthew said.
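The procedural idea can be sketched outside NUKE in a few lines of NumPy: greyscale passes come in, and all colour and glow are applied downstream, so each new render from 3D automatically picks up the look. The pass names, the screen blend and the red tint below are illustrative assumptions, not the studio's actual composite.

```python
import numpy as np

def composite_passes(core, glow, tint=(1.0, 0.15, 0.1)):
    """Combine greyscale render passes into a tinted, glowing RGB image.

    A minimal sketch of the procedural compositing idea: the 3D renders
    are plain intensity passes, and colour is applied entirely in the
    comp, so look changes never require a trip back to 3D.
    """
    core = np.asarray(core, dtype=np.float64)
    glow = np.asarray(glow, dtype=np.float64)
    tint = np.asarray(tint, dtype=np.float64)
    # Screen-blend the glow pass over the core filament pass, then
    # multiply by the tint so every new render inherits the look.
    combined = 1.0 - (1.0 - core) * (1.0 - glow)
    rgb = combined[..., None] * tint  # broadcast intensity to 3 channels
    return np.clip(rgb, 0.0, 1.0)

# Two-pixel example: a dim filament and a bright one, each with glow
img = composite_passes(np.array([[0.2, 0.8]]), np.array([[0.5, 0.5]]))
```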
“Every time we piped in the black and white passes from Cinema 4D to NUKE, the resulting fully-coloured and lit scene, looking drastically different from where we started, never ceased to amaze me. Smoke was then used to assemble the project's 16 scenes in stereo and to work on compositing and grading with real-time playback,” said John. Smoke artist Cory Davis assembled the nightly renders for review every morning, allowing everyone to see the latest versions in real-time in Sarofsky’s finishing suite, which has a large colour-calibrated 4K monitor that can display stereoscopic imagery.
Native Stereo 3D Production
The project’s native stereo production made planning one of the most important aspects of the project. “Before any pixel was ever rendered, the team dedicated time to discussing this show’s production pipeline,” Matthew said. “We established the precedent of rendering three cameras from Cinema 4D.”
While the 3D version uses the left and right eyes, a centre eye was also rendered to create the 2D version. Cinema 4D’s built-in stereo workflow took out a lot of the guesswork. Using its native Stereo Camera, the team output all three passes separately from a single camera, an important time saver, and could easily adjust eye separation, placement and zero parallax. From there, they output stereo frames and piped them into NUKE for delivery to Smoke.
“Using the centre eye is not specific to Cinema 4D or even the conventional stereo pipeline,” Matthew said. “I think it is pretty specific to our productions, because as a design company, we are very focused on composition. We decided that the offset, which the left eye and right eye utilize to generate depth, was unacceptable for our 2D master, and that the best method for generating a 2D master that truly represents how we want the scenes composed was with a ‘centre’ eye that does not have a stereo offset.
“The three eyes were first brought into NUKE and merged into a single tree that allows the artist to work on one eye, but apply the changes across all three. The compositors then sent their renders to Smoke. Cory established a similar workflow to accommodate receiving three eyes per scene, and edited three timelines in unison. He would assemble one eye, then apply changes across the other two. More critically, he performed the final grade, unifying the colour across the entire piece.”
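The three-eye setup described above reduces to a small piece of arithmetic: left and right cameras offset by half the eye separation, with a centre eye at zero offset for the 2D master. This is generic parallel-camera maths, not Cinema 4D's API; convergence and zero-parallax handling are omitted from the sketch.

```python
def stereo_eyes(center_x, eye_separation):
    """Return x-positions for the left, centre and right cameras.

    Sketch of a three-eye rig: the 2D master renders from a centre eye
    with no stereo offset, while the left and right eyes sit half the
    eye separation to either side. Parallel cameras only; a production
    rig would also manage convergence and zero parallax.
    """
    half = eye_separation / 2.0
    return center_x - half, center_x, center_x + half

# Example with an illustrative 6.5-unit interaxial separation
left, centre, right = stereo_eyes(0.0, 6.5)
```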
Because the offset dictates whether the picture has positive or negative stereo depth, setting the amount of that depth was a close collaboration with the Marvel stereo team to match the rest of the film. John said, “Once we matched the offset, we could control within Cinema 4D whether we wanted to add or subtract depth for each scene as a creative problem-solving task. For more troublesome scenes, we would render out each element with a different offset or animate the offset to make sure we didn't stress the audience.
“To view the stereo, the artists used good old red-and-blue anaglyph glasses to preview and adjust the offset on their screens. Seeing our whole crew wear these always made me smile because we were creating high-end graphics using an old school method, but it really is the best tool for the job. From there, we took it through the full pipeline out from NUKE and into Smoke for an accurate representation of the final image and depth using up-to-date active 3D glasses.”
The sense of the depth changing comes from the camera moving through the Cinema 4D scenes. Fortunately, only a few instances required the depth to change across frames, whenever they had to create transition points between two scenes that had varying stereo depths. For this, they developed a way to animate the depth or offset to zero at the point of transition – that is, flattening the stereo and then expanding the depth/offset as they entered the new scene.
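That flatten-then-expand trick can be sketched as a simple ramp on the eye separation: the first scene's depth eases to zero at the cut, then the next scene's depth grows back from zero. The frame ranges and the linear easing below are assumptions for illustration, not the production's actual curves.

```python
def transition_separation(frame, trans_start, trans_end, sep_a, sep_b):
    """Eye separation across a transition between two scenes.

    Sketch of the flatten-then-expand approach: ramp scene A's
    separation down to zero at the midpoint of the transition, then
    ramp up to scene B's separation. Linear easing is an assumption.
    """
    mid = (trans_start + trans_end) / 2.0
    if frame <= trans_start:
        return sep_a
    if frame >= trans_end:
        return sep_b
    if frame <= mid:
        # Flatten: scale scene A's depth down to zero at the midpoint.
        t = (frame - trans_start) / (mid - trans_start)
        return sep_a * (1.0 - t)
    # Expand: grow scene B's depth from zero after the midpoint.
    t = (frame - mid) / (trans_end - mid)
    return sep_b * t

# Example: a transition over frames 10-20 between separations 6.0 and 4.0
flat_point = transition_separation(15, 10, 20, 6.0, 4.0)
```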
Matthew described an ongoing battle with stereo rivalry. "While our native stereo pipeline is solid, the design of this project worked against it, due to the glowing speculars and thin splines in each eye fighting against one another and creating stereo rivalry. Fortunately, the pipeline allowed us to view stereo shots early enough in the process to address the root causes of the conflict. We resolved these issues by isolating the element and adjusting the right eye to more closely match the left eye. At times, it required a separate render pass from Cinema 4D. At other times, it was fixed within NUKE."
Erin explained that while the typography inside the boxes looks simple, those elements were their biggest challenge overall. "Our title had to be exactly two minutes long and all the names are legally required to be on screen for a specific amount of time, so it was quite the puzzle to solve!" Motion designer Duarte Elvas developed the type technique, and editor Andrew Manne managed a complicated workflow that could cope with countless small changes.
The camera is in constant motion, accelerating and decelerating at an even pace from start to finish. At each shift in scale, the world is redefined and the credits in the box change; the moments in between are filled in with fascinating transitional frames.
Sarofsky balanced the legal with the aesthetic requirements as simply as possible. Erin said, “Changing scale every time a credit changes immediately gave us a paradigm to work within. We knew we were going to start above the earth and end one push after the last subatomic realm, in total darkness. The subatomic realm was defined in the movie, so we knew there were about five depths to that.”
They also decided at what point to place the ‘Ant-Man’ title and the star Paul Rudd’s credit - in the microscopic world, of course, about halfway through the sequence. Then they storyboarded and filled in the gaps. Keeping the camera above the ground long enough was another consideration, so they worked in a few fun moments like the plane engine, the feathers, and sinking down through the leaves of the tree.
Producing titles for Marvel’s films has tested Sarofsky’s team and every system in their pipeline. Though primarily known for its design expertise, the facility expanded its capabilities to support this project and now performs full editorial and finishing services.
“The majority of the expansion happened right when we got our first Marvel job, ‘Captain America: The Winter Soldier’. We implemented all of Disney’s Tier 1 Security requirements, which included several upgrades that now allow us to accommodate a 4K stereoscopic workflow - we’re now the only studio in Chicago that has 4K stereo capabilities. Since then and throughout the Ant-Man main-on-end production, we have installed and launched the two Smoke systems and finishing rooms and five editorial stations, and will also be adding Flame services.”
A critical part of their expansion was the installation of a high-speed storage environment for editorial and finishing. Matthew said, “This gives us the ability to start an editorial session in one room and complete it in another. Furthermore, it means that two Smoke artists can collaborate on the same project or work directly with our editors. In terms of upgrades, this makes a massive difference to our capabilities.”

sarofsky.com