Among the updates to its HoloSuite platform, volumetric video developer Arcturus has released a new beta feature that connects separate volumetric video clips and blends them together smoothly. This capability opens the potential to populate virtual production scenes and sequences with real people, build live-action branching narratives without gaps and generate original experiences for the metaverse using people instead of CG avatars.
Kamal Mistry, CEO of Arcturus, sees volumetric video not only as a way to tell video stories with a new look or style. “With the right tools, it presents possibilities that simply weren’t there before,” he said. “Our new tools will open up many new applications across multiple industries.”
With the Blend tool, HoloSuite users can start with a volumetric video clip featuring a live-action performance, and connect that clip to other, related clips, creating an imperceptible transition from one segment to another. Artists can also use Blend to loop the end of a clip back to its start, creating a recording of a photorealistic subject, moving and interacting, that continually repeats.
While the applications vary, game engine users with the HoloSuite plugin will find particular advantages. A blended volumetric video clip can be inserted into a digital environment to populate a virtual scene with live-action 3D performances, without first creating a cast of digidoubles to produce a sequence manually, or trying to force 2D video into a 3D setting.
The Blend feature will also give users the ability to build live-action branching narratives without perceptible change between video tracks. HoloSuite can take multiple recordings of a subject and precisely blend them together, giving the end-user an uninterrupted experience as they follow various paths based on their choices. These options will also give creators a way to explore new forms of digital storytelling and build AR/VR experiences.
Blend is the result of years of research and development the Arcturus team has devoted to volumetric video, combined with requests from volumetric content creators. Beta testing will continue over the next few months, fine-tuning the Blend functionality for compatibility and stability, after which Blend will become a standard feature. All current HoloSuite users will be able to participate in the beta.
As well as Blend, HoloSuite will receive a series of ease-of-use improvements and upgrades. Unity artists using the HoloSuite plugin now have improved OMS (Open Media Stack) playback, with better viewing controls for volumetric video files available within the game engine. Further support for OMS playback in Unreal Engine 5 is planned as well.
Volumetric capture files from 4DViews can be imported directly into HoloEdit.
New native support for the 4DViews volumetric capture system allows files from 4DViews to be imported directly. Before this HoloSuite update, files produced by 4DViews had to be converted to a generic file type before they could be imported into HoloSuite, an extra step that could be time-consuming depending on the number of files involved. Once the files are imported, HoloSuite adjusts the file type on its own, so 4DViews users face no extra steps before working on their files.
HoloSuite users can generate smooth normals to create better surfaces for relighting and, from there, make quicker lighting changes on volumetric subjects in game engines. A smooth shading model performs a lighting calculation at each vertex of a face, then interpolates the resulting colours across the interior of the face. Typically, the normal at each vertex is used to calculate that vertex's colour.
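The smooth normals described above are conventionally produced by averaging the normals of the faces that share each vertex; HoloSuite's internal method is not documented, so the following is a minimal sketch of that standard technique on a hypothetical triangle mesh (tuples of vertex positions plus index triples), not Arcturus's implementation.

```python
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def normalize(v):
    m = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/m, v[1]/m, v[2]/m) if m else v

def smooth_vertex_normals(vertices, faces):
    """Accumulate each face's normal into its three vertices, then renormalize.

    The unnormalized cross product of two edges has magnitude proportional to
    the face's area, so larger faces contribute more to a shared vertex normal.
    """
    acc = [[0.0, 0.0, 0.0] for _ in vertices]
    for i, j, k in faces:
        n = cross(sub(vertices[j], vertices[i]), sub(vertices[k], vertices[i]))
        for idx in (i, j, k):
            for axis in range(3):
                acc[idx][axis] += n[axis]
    return [normalize(tuple(a)) for a in acc]

# A unit quad split into two triangles: every smooth normal points along +Z.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
faces = [(0, 1, 2), (0, 2, 3)]
print(smooth_vertex_normals(verts, faces))  # four normals, each (0.0, 0.0, 1.0)
```

A renderer would then shade each vertex with these normals (for example, a Lambertian dot product against the light direction) and interpolate the colours across each face, as the paragraph above describes.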
Environmental lighting can now be created directly within HoloEdit for more dynamic viewing of imported clips. Although the lighting cannot be baked and exported, this feature is especially useful when targeting a game engine as the final output: you can create the intended lighting in HoloEdit and make any other necessary changes before exporting to the game engine.
HoloEdit includes multi-bone retargeting.
Previously, once a clip had been exported into a game engine and viewed with environmental lighting, the user might realise that the texture or mesh needed adjustment and have to return to HoloSuite. This update helps avoid that round trip. It is also useful as a general review tool, to highlight specific areas or to see how the lighting may need to work in different outputs before you are ready to export.
HoloSuite is available now. Annual licenses are available for commercial and private use. This update is available at no additional cost to current license holders. Special pricing for education is available as well. arcturus.studio