Pixotope Hosts Super Bowl Heroes with Live Mixed Reality Opener
As the opener to Super Bowl LIV, the NFL revealed its ‘100 All-Time Team’ by staging a seven-minute mixed reality pre-game presentation at the Hard Rock Stadium in Miami, Florida. The production combined and augmented live action from multiple moving cameras with virtual stages and 3D graphics for fans of the 2020 Super Bowl, which attracted an estimated domestic audience of 100 million, plus international viewers.
To achieve this presentation, the NFL engaged fan experience company The Famous Group to design and choreograph the pre-show content. Their brief was to celebrate the history of the sport and the contributions of 100 famous players and 10 coaches, using graphics, historical film and live shots of attendees. The Famous Group chose to deliver the presentation using mixed reality virtual elements.
Matching the Stadium Environment
As their content creation software, they selected Pixotope, a virtual production platform developed by The Future Group of Norway. The Future Group became a major contributor to the production and delivery of the NFL’s show, supplying six Pixotope installations plus the technical expertise needed to liaise with all other aspects of the production.
A number of virtual assets, such as stages and screens, had to be designed to exactly match the stadium environment they would augment. The cameras, one of which was a wire-suspended Skycam, needed to be tracked in real time as the event proceeded, with their lenses aligned so that the virtual elements would be generated at precisely the right angle to match real-world positions, moment by moment. Lighting also had to be measured so that it could be simulated in the virtual world, producing correct shadows and highlights.
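The core of that tracking problem is turning a camera's measured pose and lens into pixel positions for virtual elements. A minimal sketch of the idea, assuming an ideal pinhole model with a single yaw angle and no lens distortion (all names and numbers here are invented for illustration, not Pixotope's implementation):

```python
import math

def project(point_world, cam_pos, yaw_deg, focal_px, cx, cy):
    """Project a 3D world point (x, y, z) into pixel coordinates for a
    tracked camera at cam_pos with a given yaw, using an ideal pinhole
    model (no lens distortion). Purely illustrative values."""
    # Translate the point into the camera's frame of reference
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    # Undo the camera's yaw (rotation about the vertical axis)
    yaw = math.radians(yaw_deg)
    xc = math.cos(yaw) * x - math.sin(yaw) * z
    zc = math.sin(yaw) * x + math.cos(yaw) * z
    if zc <= 0:
        return None  # point is behind the camera
    # Pinhole projection: scale by focal length, offset to image centre
    u = cx + focal_px * xc / zc
    v = cy - focal_px * y / zc
    return (u, v)

# A point 10 m straight ahead of an un-rotated camera lands at the
# centre of a 1920x1080 frame
print(project((0.0, 0.0, 10.0), (0.0, 0.0, 0.0), 0.0, 1000.0, 960.0, 540.0))
```

In production, the pose and focal length on the right-hand side come from the tracking and lens-calibration data per frame, which is why both had to be measured so carefully on site.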
Pixotope uses the Unreal Engine to generate photorealistic renders in real time. Designers use the combined systems to create virtual elements and augment content with looks ranging from terrain, foliage and particle systems to simulated camera and lens properties. Pixotope is built for on-air deployment of mixed reality content, so that a user can control a virtual production from a single user interface.
In-stadium screens, which would show the augmented content to the crowd, had to be accurately plotted in the virtual scene so that they could be blocked out to avoid visual interference with the live scene. All this had to take place under precise time control to synchronise with the voice-over and contributing performers in the stadium.
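Blocking out a plotted screen region amounts to a garbage matte: the AR layer's coverage is forced to zero wherever a real display sits in the frame, so the live screen shows through untouched. A simplified rectangular sketch of the idea (Pixotope's actual masking is driven by the plotted 3D screen geometry; the shapes and values here are invented):

```python
def block_out_rect(alpha, rect):
    """Zero the AR layer's coverage inside a rectangular screen region.
    'alpha' is a 2D list of values in [0, 1]; 'rect' is (x0, y0, x1, y1)
    in pixels, exclusive of x1/y1. Illustrative only."""
    x0, y0, x1, y1 = rect
    for row in range(y0, y1):
        for col in range(x0, x1):
            alpha[row][col] = 0.0
    return alpha

# A 4x4 alpha plane fully covered by graphics, with a 2x2 screen masked out
plane = [[1.0] * 4 for _ in range(4)]
block_out_rect(plane, (1, 1, 3, 3))
```

After the mask is applied, compositing the AR layer over the plate leaves the masked screen pixels showing only the live video.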
Pixotope product specialist David Stroud was part of a team of mixed reality technicians, helping to specify and install the required systems and gear, and to bring together all the constituent departments. The Future Group team was involved from the outset, initially collaborating via a shared virtual project to plan the work and provide technical assistance to The Famous Group’s creative team.
The software carries out HDR 32-bit linear colour processing and rendering, working with formats up to UHD as part of standard content generation workflows using FBX, Alembic, OpenEXR and so on. Internally, it has a chroma keyer; GPU-based video processing for colour correction, mask adjustments and image effects; and non-destructive compositing that protects the video from the graphics system’s anti-aliasing.
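At its core, that non-destructive compositing step is the standard ‘over’ operation on premultiplied-alpha pixels in linear light: the graphics' partial coverage, including anti-aliased edges, attenuates the video plate rather than overwriting it. A minimal per-pixel sketch of the operator, not Pixotope code:

```python
def over(fg, bg):
    """Composite a premultiplied-alpha foreground pixel over a background
    pixel in linear light: out = fg + (1 - fg_alpha) * bg.
    Both pixels are (r, g, b, a) tuples with premultiplied colour."""
    fr, fgrn, fb, fa = fg
    br, bgrn, bb, ba = bg
    k = 1.0 - fa
    return (fr + k * br, fgrn + k * bgrn, fb + k * bb, fa + k * ba)

# A half-transparent red graphic over an opaque green plate
print(over((0.5, 0.0, 0.0, 0.5), (0.0, 0.8, 0.0, 1.0)))
```

Working in linear light is what keeps the blend physically plausible; doing the same maths on gamma-encoded values darkens anti-aliased edges and semi-transparent regions.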
At the 2020 Super Bowl
Eleven days ahead of the main event, the team took up positions at the Hard Rock Stadium. David said, “Although the virtual scene had been set up offline from scans and measurements from the stadium, there are inevitably adjustments that can only be done on site. We used our Pixotope virtual production platform to receive images and tracking data from each of the cameras, including the Skycam, and composite The Famous Group’s virtual elements onto that footage. This allowed everyone to see the augmented results in real time, making adjustments and improvements as we rehearsed the show in the days leading up to the Super Bowl.
“We helped the camera operators visualise how the pre-show would come together as, understandably, they were more used to following the unscripted action of a football game than reacting to very specific time cues. Pixotope can continuously output the final result, even while the graphics are being edited – a great tool for directors following rapid changes and updates. All of the assets and the overall design are always live.
“We also worked closely with camera tracking specialists SMT and, in fact, were able to improve our data exchanges to tune the virtual camera motion further. We worked with both the broadcaster, Fox, and the companies supplying the in-stadium displays to make sure we had as much information as possible to refine the visual quality.”
Recent developments incorporated into Pixotope 1.2 form a true scene-referred linear compositing workflow to improve realism in composites of video and 3D graphics – that is, AR compositions. A scene-referred approach improves shading and lighting, motion blur and image distortion, anti-aliasing, and semi-transparent elements such as hair, volumetrics and other FX elements. Useful in this project, the new workflow includes fine global and per-light shadow controls, luma masking from the plate to avoid double shadowing, and an ambient offset to avoid hard shadow intersections. www.futureuniverse.com
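One plausible reading of the luma-masking control described above is that the strength of a virtual shadow is scaled down wherever the camera plate is already dark, so real shadows are not darkened a second time. A hypothetical sketch of that behaviour (the function name, threshold and linear fade are assumptions, not Pixotope's implementation):

```python
def luma_masked_shadow(shadow_strength, plate_luma, threshold=0.3):
    """Attenuate a virtual shadow (0 = none, 1 = full darkening) where the
    plate luma is already below a threshold, to avoid double shadowing.
    Purely illustrative; all names and values are assumptions."""
    if plate_luma >= threshold:
        return shadow_strength  # plate is bright enough: apply shadow fully
    # Fade the virtual shadow out linearly as the plate gets darker
    return shadow_strength * (plate_luma / threshold)

# Over a bright patch of pitch the shadow applies fully; over an existing
# real shadow it is attenuated
print(luma_masked_shadow(0.5, 0.6), luma_masked_shadow(0.5, 0.15))
```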