Earlier this year, Foundry completed a two-year project titled Enabling Interactive Story Telling (EIST). Undertaken with the BBC R&D team, the project's goal was to develop tools that would help a wider group of people from different backgrounds and industries – with different sets of skills and interests – take advantage of immersive media.
AR viewers and VR headsets have continued to advance, and now need immersive content with high production values that is useful and interesting to audiences increasingly familiar with the medium. Up until now, VR experiences, non-linear stories and interactive storytelling have needed specialised tools designed for either filmmaking or games development, and have often required large production teams. These factors, and the funding they call for, have generally limited such applications to media & entertainment projects.
Through EIST, Foundry aimed to extend immersive experiences and production beyond M&E markets to sectors ready to use AR and VR – to train and inform their own employees internally, for instance, and to build user experiences that reach consumers. The architecture, engineering and construction industry, tourism and education are a few examples.
[Image: Animation sequencing and camera blocking]
Game Engines and Pipelines
EIST explored ways to generate content from CGI or live action footage – or both – faster and more easily, for businesses considering the potential of AR or VR to improve productivity or generate new business models. The approach taken by Foundry – led by Head of Research Dan Ring and Senior Research Engineer Niall Redmond – and the BBC R&D team was to develop a scene layout and sequencing tool containing non-linear media objects, with native support for Universal Scene Description (USD). They also added support for multiple renderers, including real-time raytracing, so that media types such as 360° video and light fields would be compatible with the tool.
Before starting on EIST, the research team looked at what content creators were already using to tell interactive stories. Game engines were the most common choice, due to their real-time rendering and accessible tools for building complex virtual worlds and stories. They are especially useful in the early stages to help visualise a project and make initial plans.
[Image: Layout tools]
However, it is generally not possible to integrate game engines into the rest of a production pipeline. As a result, decisions and changes made within a game engine have to be tracked and shared with the team through another medium, and data has to be re-created, causing a loss of continuity. This realisation showed the team that an immersive production needs, from its very beginning, a dedicated real-time production hub at the foundation of the pipeline.
Layout, Playback and the Real-time Hub
From that point, EIST was devoted to developing a real-time scene layout and playback review tool as an alternative to game engines for creating interactive stories, while keeping the pipeline as efficient as possible. The team used the Oculus and Vive systems to collaborate in VR in real time. Dan said, “The tool could be used for basic scene editing and quick playback review, or for more heavyweight work like sequence creation. Combining these types of real-time operations gives us the beginnings of a real-time hub.”
Using USD and FBX files, the editing tool creates an environment based on animatics, which can be iterated on during production. As assets are completed and detail is added, they can be layered into the scene and seen immediately in the visualisation. This supplies the continuity – in both the environment and the decisions behind it – that is lost when using game engines, and prevents work having to be replicated.
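To make that concrete, layering of this kind can be expressed in a few lines of USD's Python API. The sketch below is illustrative rather than EIST's own code, and the file names are hypothetical:

```python
# Minimal sketch: layering finished assets over an animatic in USD.
# File names are hypothetical placeholders, not EIST project files.
from pxr import Usd

# The shot's root layer starts by sublayering the animatic.
stage = Usd.Stage.CreateNew("shot010.usda")
root = stage.GetRootLayer()
root.subLayerPaths.append("animatic.usda")  # blocked-out proxy scene

# As a finished asset arrives, insert its layer in a stronger position.
# Earlier entries in subLayerPaths win, so the detailed asset's opinions
# override the animatic's placeholders without destroying them.
root.subLayerPaths.insert(0, "hero_asset_final.usda")

root.Save()
```

Because the animatic's opinions still exist in the weaker layer, removing the stronger layer simply restores the original blocking.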
[Image: Attribute editing and material mapping]
The result is a structured, logical way to capture and gather all data together, making sure it is timestamped, synchronised, arranged into a consistent format and delivered reliably to artists downstream in production.
“The EIST tool is built using USD as the backbone,” said Dan. “This means that all edits are done in native USD and the result of any scene changes will be one or more USD files. This way, we can pass the benefits of working with USD directly to the user, including storing decisions, assets and other elements that determine a scene, and overriding updated elements in a non-destructive way.”
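In USD terms, that kind of non-destructive override is an 'over': a stronger layer adds opinions on top of a prim defined elsewhere, leaving the source file untouched. A minimal sketch, assuming a hypothetical /World/Chair prim whose transform follows USD's common ops schema:

```python
# Minimal sketch: overriding an asset's placement without editing its file.
# The stage and prim path are hypothetical.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.Open("shot010.usda")

# OverridePrim authors an 'over' in the current edit target; the layer
# that originally defined /World/Chair is never modified.
over = stage.OverridePrim("/World/Chair")
UsdGeom.XformCommonAPI(over).SetTranslate(Gf.Vec3d(2.0, 0.0, -1.5))

stage.GetRootLayer().Save()  # only the override opinions are written
```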
USD Composition Arcs – One Story, Many Endings
The ability to incorporate non-linear media objects, mentioned at the start of this article, was important because the media in interactive experiences might be something like an animation with multiple possible endings, or an environment that can change dynamically depending on the actions of a user. Niall said, “Certain composition arcs in USD were particularly suited to working with media like this, which meant we could design easy ways to dynamically change scene content.”
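One composition arc that suits this especially well is the variant set, which stores alternative versions of part of a scene and switches between them on demand. The sketch below, with hypothetical names and files, keeps two endings as variants of a single prim:

```python
# Minimal sketch: one story, many endings, via a USD variantSet.
# Names and file paths are illustrative, not from the EIST tool.
from pxr import Usd

stage = Usd.Stage.CreateNew("story.usda")
scene = stage.DefinePrim("/Story", "Xform")

endings = scene.GetVariantSets().AddVariantSet("ending")
for name, ending_file in [("happy", "ending_happy.usda"),
                          ("tragic", "ending_tragic.usda")]:
    endings.AddVariant(name)
    endings.SetVariantSelection(name)
    with endings.GetVariantEditContext():
        # Edits made here are recorded inside this variant only.
        finale = stage.DefinePrim("/Story/Finale")
        finale.GetReferences().AddReference(ending_file)

# At runtime, a user's choice simply flips the selection and the
# scene recomposes with the other ending's content.
endings.SetVariantSelection("happy")
stage.GetRootLayer().Save()
```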
[Image: Using instancing to populate a scene]
USD composition arcs are complex but useful: they define the different ways USD files can be combined to produce a final scene. One consequence is that when you load a USD file, not everything you see in the resulting scene is necessarily contained within that particular file. Instead, the file may internally point to many other USD files, each combined into the scene in a different way.
“This has many advantages when creating and editing a scene,” said Niall. “In particular, multiple artists are able to work on the same scene simultaneously, since they will be working on a file which is only a subset of the whole scene. Composition arcs are the different ways of creating these file aggregates. There are six types of composition arc – for example, one type will import all information from another USD file, while another type allows you to take only a part of the file. What we've done in EIST is expose these very effective concepts so that a user can create a USD scene composition fairly easily.”
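One plausible reading of those two types is USD's subLayer arc, which imports everything in another file, versus a reference arc aimed at a single prim path, which grafts in only part of one. A small sketch with hypothetical file and prim paths:

```python
# Minimal sketch: importing a whole file versus taking part of one.
# All file and prim paths are hypothetical.
from pxr import Usd, Sdf

stage = Usd.Stage.CreateNew("layout.usda")

# subLayer arc: pull in everything that set_dressing.usda contains.
stage.GetRootLayer().subLayerPaths.append("set_dressing.usda")

# Reference arc with a prim path: graft in one subtree of castle.usda.
tower = stage.DefinePrim("/World/Tower", "Xform")
tower.GetReferences().AddReference("castle.usda",
                                   Sdf.Path("/Castle/Towers/North"))

stage.GetRootLayer().Save()
```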
“Consequently, 'Layout' for EIST means changing the content of a USD scene – that is, adding objects to a scene using the USD composition arc idea – or moving and scaling existing objects. 'Sequencing' means adding animation and camera moves to create one or more shots, and EIST does allow you to edit and save keyframe and camera data to produce finished sequences.”
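As a rough illustration of what saved keyframe and camera data can look like in native USD – again a sketch, not EIST's actual output – camera moves become time samples on a camera prim:

```python
# Minimal sketch: a keyframed camera move stored as USD time samples.
# Paths and values are illustrative.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("shot010_seq.usda")
stage.SetStartTimeCode(1)
stage.SetEndTimeCode(48)

cam = UsdGeom.Camera.Define(stage, "/World/ShotCam")
cam.GetFocalLengthAttr().Set(35.0)

# Two keyframes: the camera pushes in over 48 frames.
move = cam.AddTranslateOp()
move.Set(Gf.Vec3d(0, 1.6, 10), time=1)
move.Set(Gf.Vec3d(0, 1.6, 4), time=48)

stage.GetRootLayer().Save()
```

www.foundry.com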