Riot Games and Pixotope Go Live to Augment LPL Pro League Finals
Riot Games wanted to add an eye-catching augmented reality show to the LPL Pro League Finals eSports event, based on their League of Legends game characters.
Focussing on characters was a good choice for Riot Games. To play League of Legends, over 100 million monthly players take on the persona of one of the game’s Champion characters, each with its own back story and specific skills. Each Champion drives player affiliation, in the same way that real-world sports players attract a fan base.
The live 2019 LPL Summer Final event took place at Shanghai’s 18,000-seat Mercedes-Benz Arena, where the star of the augmented reality content was Akali, one of League of Legends’ most popular Champions, singing and dancing with real dancers on stage, before being interviewed live in front of the packed arena audience.
Playing for Real
Kevin Scharff, Production Director from Riot Games said, “We saw an opportunity to use AR to bring a live, real-time performance to one of our Champions, which became Akali’s interview with the host Candice. We found that the most compelling cases for AR characters take place in live interactive situations. There is something magical and authentic when a virtual character interacts ‘in the moment’, going beyond a scripted, pre-recorded performance. AR allowed us to marry the virtual character to the real world.”
Bringing game characters into the real world is important to the diversified fan experience for Riot Games. Sophie Xiao from Riot said, “Virtual idols are very popular in China. Production techniques and people’s understanding of virtual idols are rapidly maturing. Our goal is simple: keep building up our virtual idols and make them feel as if they are real human beings.”
Taking Akali, a CG character, from a game and using her to augment shots from the live event required detailed interaction with show lighting, free moving cameras, recorded and live motion capture, as well as facial expression capture with lip-sync.
Augmented Reality Specialists
To support the Riot team’s work developing the character assets for the event, The Future Group, developer of the mixed reality software platform Pixotope, produced the AR experience. Because so much 3D and animation work has to be carried out precisely ahead of the actual production, their team maintains expertise across computer graphics disciplines to draw on when creating an augmented reality event.
Consequently, they can work directly with their customers’ creative departments to bring technically complex concepts and ideas to the screen. Beyond that collaboration, they also have the capacity to suggest alternative approaches and ways to achieve maximum impact from new audiovisual equipment.
For the Riot Games event, The Future Group deployed a team of five AR experts, including Chief Creative Officer Øystein Larsen and senior product specialist David Stroud to assess and solve various challenges.
The Future Group specified and installed four Pixotope augmented reality platforms. Each system generated real-time augmented elements and combined them with a live action background. There were three Pixotope systems connected to three different camera feeds, plus one more in reserve as a live backup. Each Pixotope contained a digital model of the Akali character and stage assets, as well as a virtual tracked camera which matched its real-world counterpart.
Just as in traditional post-production compositing, a significant challenge when adding an augmented character to video is lighting the character to match the lighting of the background scene. CCO Øystein Larsen said, “Failure to achieve this accurately leads to characters that look stuck onto the video and not embedded into the real-world scene. The specific challenge for this event was that Akali was to perform a dance song under pulsing show lighting that synchronised with a music track. Added to this, projected images were mapped to the arena floor.”
Pixotope Product Specialist David Stroud said, “We used high dynamic range (HDR) site photography to accurately map the real-world lighting rigs to the virtual world created within Pixotope, where the digital Akali would be lit before being composited over the live camera shots. The arena lighting was pre-programmed and driven by timecode, as were the virtual lights within Pixotope, to make sure they were indistinguishable.”
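The timecode-driven approach described above can be sketched in a few lines: both the physical rig and the virtual lights read the same clock, so each can look up the same pre-programmed cue. This is an illustrative sketch only; the cue list, function names and non-drop-frame counting are assumptions, not Pixotope's actual implementation.

```python
from bisect import bisect_right

def timecode_to_frames(tc: str, fps: int = 60) -> int:
    """Convert a non-drop SMPTE timecode 'HH:MM:SS:FF' to an absolute frame count.
    (Real 59.94 fps broadcast uses drop-frame timecode; this sketch ignores that.)"""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def light_intensity(frame: int, cues: list) -> float:
    """Linearly interpolate intensity between timecoded cues, so a virtual
    light follows the same pre-programmed curve as the physical rig."""
    frames = [f for f, _ in cues]
    i = bisect_right(frames, frame) - 1
    if i < 0:
        return cues[0][1]
    if i >= len(cues) - 1:
        return cues[-1][1]
    (f0, v0), (f1, v1) = cues[i], cues[i + 1]
    return v0 + (v1 - v0) * (frame - f0) / (f1 - f0)

# hypothetical cue list: (frame, intensity 0..1)
cues = [(0, 0.2), (120, 1.0), (240, 0.5)]
frame = timecode_to_frames("00:00:01:00")   # 60 frames in
print(light_intensity(frame, cues))         # halfway up the first ramp -> 0.6
```

Because both sides derive their state from timecode rather than from each other, neither system needs to wait on the other at show time.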
A secondary function of lighting is the creation of shadows. The human eye is unforgivingly adept at noticing when shadows have not been correctly simulated. To avoid this, Pixotope enabled the use of ray traced shadows for Akali’s interview to produce a photoreal result. Computationally expensive, ray tracing is known for causing long render times in post-production.
To overcome this problem for real-time projects like this one, in which accurate shadows had to be produced as the interview proceeded with no delays, The Future Group built Pixotope on the Unreal Engine, combined with their proprietary rendering pipeline. Because the show was broadcast at 59.94 fps, the calculations had to be completed and composited over every single frame of the live action in under 17 milliseconds.
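The arithmetic behind that constraint is simple: at the NTSC broadcast rate of 59.94 fps, rendering, ray-traced shadows and compositing must all fit in the time one frame is on air.

```python
# 59.94 fps is the NTSC broadcast rate, exactly 60000/1001 frames per second.
fps = 60000 / 1001
frame_budget_ms = 1000 / fps   # time available per frame, in milliseconds

# Everything - character animation, ray-traced shadows, compositing -
# must complete within this budget, every frame, with no dropped frames.
print(f"{frame_budget_ms:.2f} ms per frame")   # 16.68 ms per frame
```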
Øystein Larsen said, “Apart from shadows that fall on the floor, we also paid attention to self-shadows that are cast from one part of a CG model onto another part of the same model. These significantly enhance the photoreal looks of a generated image, but are often the first casualty when determining whether images can be produced in real time.
Due to the efficiency of Pixotope/UE4 render pipeline, Akali’s baseball cap correctly and accurately cast self-shadows over the face and reacted perfectly to the lights as her character moved, changing the direction of the shadows with respect to the light sources.”
Akali also had to move realistically. Specifically, she first had to dance in sync with the dancers in camera, and then later respond to a live interviewer and audience. The dance moves were directed and choreographed by the Riot team. Then an actor performed them and the motion was captured at Animatrik Film Design in Los Angeles. The movement data was combined with the Akali character model within Pixotope and produced in real-time, aligned with timecode reference to achieve synchronisation with the music track.
For the interview sequence, Pixotope combined the same model with a live motion capture stream derived from an actress performing off stage, instead of pre-recorded motion captured data. This resulted in the moments of interactivity and spontaneity that appealed to fans. For the facial performance, Cubic Motion’s Persona system streamed facial capture data live from the off-stage actress, who was also performing the voice for Akali.
Akali in a more familiar place
Going for Real Time
Because the arena was a very noisy environment, all of the motion capture and audio recording took place in a sound-proofed anechoic room about 100m away. Pixotope combined both the body and facial motion capture feeds to drive the geometry of the CG Akali model. This work required some specific intervention from The Future Group’s technicians: in game development, motion capture data is recorded, cleaned up and stored within the game to be recalled during game play, but for live events like this, the motion capture data needs to drive the CG geometry directly, live.
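The difference from the game workflow can be illustrated with a minimal sketch: each incoming capture frame is applied straight to the rig, with no offline clean-up pass in between. The rig structure, bone names and Euler-angle representation here are simplifying assumptions for illustration, not Pixotope's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Bone:
    name: str
    rotation: tuple  # (x, y, z) Euler angles in degrees (simplified)

def apply_live_frame(skeleton: dict, stream_frame: dict) -> None:
    """Apply one incoming motion-capture frame directly to the rig.
    Unlike the game workflow (record, clean up, store, recall), whatever
    arrives on the live stream drives the CG geometry on this frame."""
    for bone_name, rotation in stream_frame.items():
        if bone_name in skeleton:          # ignore channels the rig lacks
            skeleton[bone_name].rotation = rotation

# hypothetical two-bone rig receiving one streamed frame
rig = {"head": Bone("head", (0, 0, 0)), "jaw": Bone("jaw", (0, 0, 0))}
apply_live_frame(rig, {"head": (0, 15, 0), "jaw": (8, 0, 0)})
print(rig["head"].rotation)   # (0, 15, 0)
```

The trade-off is that any glitch in the raw capture reaches the screen immediately, which is why the rehearsal and testing described below mattered so much.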
David Stroud described pulling together all the creative and technical partners alongside The Future Group in Manchester, UK to spend several days testing the motion capture setup prior to the live event. “We built the systems in Manchester and then worked with the actress who would perform both the voice and actions for Akali. With the creatives from Riot, we helped her work out the appropriate level of expressiveness and get used to working in isolation away from her interviewer. We also solved various technical challenges, such as synchronising body and facial motion capture data together with vocal audio.”
Kevin Scharff said, “We worked together across three continents for this experience – in Los Angeles for the original motion capture of the dance beat, in Shanghai for casting, modelling, location scouting and pre-visualisation, in Manchester for a technical test and capture of the live interview, and finally in Shanghai for the live event.”
Three Angles, Three Cameras, One Experience
Having completed the lighting and motion setups for the Akali digital character, the next challenge was to ensure that this material would be rendered from a viewpoint that precisely matched the shots of the carefully choreographed live-action dancers. Øystein Larsen said, “There were three different live action shots to match – one from a moving camera mounted on a jib arm, one on a free roaming Steadicam and one high above the arena. Each camera feed was sent to a dedicated Pixotope system, along with camera positional, directional and lens information, encapsulated in a data stream provided by camera tracking specialist, Stype.”
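A per-frame tracking packet of the kind described can be sketched as a small data structure: pose plus lens state, from which the virtual camera derives its field of view. The field names and sensor-width assumption here are illustrative, not Stype's actual protocol.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackingPacket:
    """One frame of camera tracking data: where the camera is, where it
    points, and what the lens is doing. Field names are illustrative."""
    position: tuple       # (x, y, z) in metres, arena coordinates
    rotation: tuple       # (pan, tilt, roll) in degrees
    focal_length_mm: float
    sensor_width_mm: float = 35.0   # assumed sensor size for this sketch

def horizontal_fov_deg(pkt: TrackingPacket) -> float:
    """Field of view the virtual camera must adopt so the CG render
    lines up with the real lens on that frame."""
    return math.degrees(2 * math.atan(pkt.sensor_width_mm /
                                      (2 * pkt.focal_length_mm)))

pkt = TrackingPacket(position=(4.0, 1.8, -12.5),
                     rotation=(30.0, -5.0, 0.0),
                     focal_length_mm=35.0)
print(round(horizontal_fov_deg(pkt), 1))   # 53.1 degrees
```

Because lens data arrives every frame, a zoom or focus pull on the physical camera is mirrored by the virtual camera without manual intervention.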
The motion capture technical feasibility team at Cubic Motion in Manchester, UK.
Each of the three Pixotope systems then created a matching camera inside its respective virtual scene to render the animated, precisely lit Akali model and composite it perfectly onto the corresponding camera feed. Those feeds were then sent to the broadcaster’s production switcher, in the same way as regular camera feeds. The resulting programme was broadcast externally and shown on giant screens within the arena, so that the attending audience could watch the AR experience.
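At the heart of that final step is the classic "over" composite: for each pixel, the rendered character and its alpha matte are blended with the live camera pixel behind it. A single-pixel sketch of the standard operation (not Pixotope's internal pipeline, which runs this on full video frames on the GPU):

```python
def composite_over(fg: tuple, alpha: float, bg: tuple) -> tuple:
    """Standard 'over' operation with a straight (unpremultiplied) alpha:
    out = fg * alpha + bg * (1 - alpha), applied per channel, per pixel,
    per frame. fg is the CG render, bg the live camera pixel."""
    return tuple(round(f * alpha + b * (1 - alpha)) for f, b in zip(fg, bg))

# hypothetical pixels: a half-transparent red CG edge over a grey background
print(composite_over((255, 0, 0), 0.5, (100, 100, 100)))   # (178, 50, 50)
```

Where alpha is 1 the character fully covers the camera feed; where it is 0 the live action passes through untouched, which is what lets the composited output be switched like any regular camera.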
Eyes Wide Open
Live broadcast and events inevitably throw up last minute and unexpected challenges to solve. The Future Group’s David Stroud recalls a moment that demonstrates the unpredictability of live events. “When it came to the live dance performance, the Steadicam operator chased and framed the augmented Akali character perfectly.
“But then after the event, he told us that the augmented reality video feed to his monitor somehow became disconnected from the signal we were supplying, so he couldn’t see the character he was supposed to be framing at all. He shot the whole performance by remembering Akali’s movements from the rehearsals!”
“In live events and broadcast situations, you need systems and people that can react quickly to evolving issues. The Pixotope system we make and the expertise we have is the product of many adventures in AR production. There’s always some unexpected element that needs to be accommodated.” www.futureuniverse.com