The Future Group Opens a World of Mixed Reality for Live Production
The Future Group's mixed reality production hub, Pixotope.
The two worlds of live production and computer gaming are colliding, entirely changing the potential of live entertainment. In particular, new content possibilities have emerged using virtual environments and augmented reality.
The team at The Future Group in Oslo, Norway, has gained recognition for their involvement in AR productions such as the interactive mixed reality TV show ‘Lost in Time’, The Weather Channel's ‘Explainers’ and Riot Games’ ‘League of Legends’ world championship show. Recently, they have undertaken an AR journey of their own by evolving the business to help other producers take advantage of new opportunities.
“Until recently, we have been extensively involved in generating creative material used in AR events and productions of a type that no one had attempted before,” said The Future Group’s CEO Marcus Blom Brodersen. “These experiences brought us face-to-face with the technical challenges that AR and MR projects present, leading us to invent our own systems and tools that evolved into our product called Pixotope – essentially a mixed reality (MR) production system that brings together all the various parts of an MR production.”
Pixotope takes French TV channel TF1 on a Voyage to Mars.
Going forward under a new management team led by Marcus, Chief Creative and Product Officer Øystein Larsen and Chief Commercial Officer Hans Svilosen, The Future Group has now re-focused to concentrate solely on developing Pixotope and supporting creative clients with their technical MR expertise. By partnering with pioneering clients and helping them take advantage of promising live production opportunities, they hope to help shape the new virtual horizon.
Encouraged by what the industry has achieved so far, they are keen to take advantage of new opportunities based on the ability to produce high-quality, interactive CG imagery in real time, which can then be combined with live-action presenters and actors. From setting stories in large-scale virtual worlds to incorporating interactive real-time virtual characters in live settings, real-time digital effects are changing producers’ ideas about the relationship between programming and audience engagement.
Computer games have used real-time interactive CGI for some time – what has changed rapidly in recent years, however, is the fact that they now display near-cinematic quality images in real-time. Marcus said, “A key driver for this has been the use of game engines to handle graphics, interactivity and sound. The two most popular game engines are Unreal Engine from Epic Games and Unity from Unity Technologies.
“As these game engines have become more efficient and started using the power of purpose-built graphics processing hardware, the resulting ability to generate such high-quality imagery in real-time creates very interesting possibilities for live programme and events production.”
CEO Marcus Blom Brodersen
Chief Creative and Product Officer Øystein Larsen
Chief Commercial Officer Hans Svilosen
Tuning Up the Game Engine
Nevertheless, artists like those at The Future Group can’t escape the fact that game engines are not made for studio creatives. They reside squarely in the domain of programmers. Neither are game engines designed for the rigours of live production, where rapid incorporation of new ideas and the ability to immediately react to the unexpected are the basic requirements for any live system.
“This is why we designed and created Pixotope, to make all the functionality of the Unreal Engine available to people working in live production. As our complete virtual production hub, combining all the elements involved in mixed reality production, Pixotope takes a live video feed from a camera and allows it to be used as either a live background or as an insert, using either an external chroma keyer, or the system’s integrated chroma keyer.”
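The keying step described above can be illustrated with a minimal sketch (this is a generic green-dominance key written for illustration, not Pixotope's actual implementation, and it assumes NumPy and matched-size RGB frames):

```python
import numpy as np

def chroma_key(fg, bg, threshold=40):
    # fg, bg: HxWx3 uint8 RGB frames of identical shape.
    # Treat a pixel as "green screen" when its green channel exceeds
    # both other channels by more than `threshold`.
    f = fg.astype(np.int16)
    green_excess = f[..., 1] - np.maximum(f[..., 0], f[..., 2])
    mask = (green_excess > threshold)[..., None]  # HxWx1, broadcasts over RGB
    # Keyed-out pixels come from the background, the rest from the foreground.
    return np.where(mask, bg, fg)
```

A production keyer would add soft edges, spill suppression and colour-space handling; the hard threshold here only conveys the basic idea of replacing green pixels with a second image.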
It also receives tracking data from familiar tracking systems so that a virtual camera built within Pixotope will match a real-world camera. It can apply live motion capture and facial capture data to a computer-generated character. In short, its functionality is based on the concept of connecting elements and assets to almost any kind of data input to allow characters, graphics or whole scenes to react to external data-driven events. Multiple Pixotope systems and cameras can collaborate to replicate a multi-camera live studio environment.
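The "connect any element to any data input" concept can be sketched as a simple event router. The class, channel names and handlers below are hypothetical illustrations, not Pixotope's API:

```python
from collections import defaultdict
from typing import Any, Callable

class DataRouter:
    """Route external data events (camera tracking, motion capture,
    scores, weather feeds) to the scene elements bound to them."""

    def __init__(self):
        self._bindings = defaultdict(list)

    def bind(self, channel: str, handler: Callable[[Any], None]):
        # Attach a scene element's update callback to a data channel.
        self._bindings[channel].append(handler)

    def feed(self, channel: str, value: Any):
        # Deliver one incoming sample to every bound element.
        for handler in self._bindings[channel]:
            handler(value)

# Usage: a virtual camera mirrors the tracked studio camera's pose.
router = DataRouter()
camera_state = {}
router.bind("camera/pose", camera_state.update)
router.feed("camera/pose", {"pan": 12.5, "tilt": -3.0, "focal_mm": 35.0})
```

In a live multi-camera setup, each incoming sample would additionally be timestamped and delay-compensated so that graphics and video stay in sync, but the binding pattern stays the same.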
Mixed reality platform for ‘Dota 2 Tug of War: Mad Moon’ tournament in Kiev, Ukraine.
Flexibility of Real Time
The entire team’s combined experience has demonstrated that flexibility is crucial to producing mixed reality. “Live shows result from interactive iteration, and Pixotope is designed to permit rapid creative evolution so that everyone, from client to cameraperson, and presenter to producer, can see and modify augmented results immediately, in real time,” Marcus said.
“Our mission is to support, enable and inspire producers to explore and take advantage of the creative opportunities mixed reality productions open up. By taking real-time game engine functions as far as possible, expansive, diverse, continuously evolving environments like those in the highest-ranked games can be developed and incorporated into live broadcasts and events. Into these new worlds, live action talent, data-driven elements and augmented characters can be composited.”
Furthermore, beyond familiar newsroom and sports broadcasts, The Future Group envisions greater use of virtual production in applications ranging from children’s entertainment to episodic TV and game shows, because it permits setting stories in non-real environments without relying on extended post-production timescales and large budgets.
Alternative uses for the data-driven aspect are also interesting. Currently, the term ‘data-driven’ is generally associated with stock tickers, news headlines and weather statistics, but it could equally mean live audience interaction with real-time augmented reality characters via Twitter hashtags. “This leads to intriguing possibilities concerning crowd-influenced storylines and personalised content,” Marcus considered.
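As a toy illustration of that idea (entirely hypothetical, not a described Pixotope feature), live hashtag tallies could be reduced to a single blend weight that drives an augmented character's animation:

```python
def hashtag_mood(counts):
    """Map live hashtag tallies to a character 'mood' in [-1.0, 1.0].

    counts: dict such as {"#cheer": 240, "#boo": 60}; the hashtag
    names are invented for this example. The balance of positive vs
    negative tags becomes a blend weight between, say, a 'celebrate'
    and a 'sulk' animation.
    """
    pos = counts.get("#cheer", 0)
    neg = counts.get("#boo", 0)
    total = pos + neg
    if total == 0:
        return 0.0  # no audience input yet: neutral pose
    return (pos - neg) / total
```

A real deployment would smooth the value over time and rate-limit the social feed, but the core step of turning crowd input into an animation parameter is this simple.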
“Of course, the E-Sports market has huge potential for AR. Rather than viewers watching from the sidelines of an E-Sports competition, virtual production takes the viewer inside the game scenes, to customised points of view to get right inside the action. Presenters can also appear to carry out the broadcast from within the game environment itself.”
Marcus regards the possibilities as almost endless, and believes the interactive potential is what could unlock a whole new genre of live entertainment.
Engaging the Audience
It has been very exciting for The Future Group to see their customers using the Pixotope platform to win ratings wars and engage audiences by telling their stories in different, creative ways. For example, The Weather Channel has created live broadcasts with their presenters immersed in high-energy scenes of floods, tornadoes and snowstorms. Meanwhile, Riot Games presented the Pro League regional finals for their game League of Legends in front of a live arena audience, while augmented virtual characters sang, danced and talked to interviewers live on stage.
“Audience engagement is among the main drivers for broadcast innovation,” said Marcus. “The teenage/young adult demographic is still hard to target as the younger generation grows less and less inclined to watch mainstream, linear TV. Virtual production techniques help to move broadcast and events into new spaces. For example, interactive live action can take place inside a successful computer game environment and be delivered via game-centric streaming services such as Twitch.
The Famous Group used Pixotope to create a Baltimore Raven swooping over the action at the stadium.
“Virtual productions can capture the imagination of an audience and dramatically increase audience engagement through the creation of cut-through moments. Another customer of ours, the similarly named The Famous Group, created an augmented raven for the coverage of the American Football Conference Northern Division Final, which the Baltimore Ravens won. The augmented clip was re-tweeted a staggering number of times.”
Marcus’ vision for the future of MR depends on the enormous potential for mixed reality productions to attract a high level of engagement. “That ability will surely drive innovative producers to create more and more virtually originated content,” he said. “Increasing competition from multiple streaming providers and direct-to-customer feeds requires every content platform to work harder to stand out. Interactive virtual productions can help build strong brand recognition and form the foundation of valuable intellectual property.” www.futureuniverse.com