Nozon and V-Ray Navigate Photoreal 3D Spaces with VR Artists
Nozon and Chaos Group have formed a technology partnership resulting in a software pipeline artists can use to create pre-rendered, volumetric VR. Using PresenZ and V-Ray, artists can design spaces with free movement and full parallax, without the help of a game engine.
Positional tracking has long been recognized as necessary for successful immersion. Without it, the genuine sensation of presence within an artificial environment is broken and the user feels uncomfortable. Game engines have been the standard way to deliver all six degrees of freedom [forward/back, up/down, left/right, pitch, yaw and roll], but they compromise image quality and scene complexity.
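To make the six degrees of freedom concrete, here is a minimal Python sketch distinguishing full 6-DOF tracking from the rotation-only tracking of earlier systems. The names `HeadPose` and `is_rotation_only` are purely illustrative and not part of any VR SDK.

```python
from dataclasses import dataclass

# Illustrative only: a head pose with all six degrees of freedom.
# Rotation-only ("3-DOF") tracking, as in 360-degree video, pins the
# three translation axes to a fixed point in space.

@dataclass
class HeadPose:
    x: float = 0.0      # left/right translation
    y: float = 0.0      # up/down translation
    z: float = 0.0      # forward/back translation
    pitch: float = 0.0  # look up/down
    yaw: float = 0.0    # turn left/right
    roll: float = 0.0   # tilt head sideways

def is_rotation_only(pose: HeadPose) -> bool:
    """True if the pose exercises only the three rotational degrees."""
    return pose.x == pose.y == pose.z == 0.0
```

A pose with any non-zero translation is what positional tracking adds on top of head rotation.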
The partnership was formed to use offline rendering to support the movements that make VR feel real, such as turning the head or walking around inside an image. Because the scene is planned ahead and rendered offline, the artist decides how high a polygon count or scene density he or she wants to manage.
Creating volumetric VR with PresenZ follows the traditional 3D workflow. Scenes are built in a 3D application and rendered directly in V-Ray, creating lifelike VR that allows greater movement. It’s a relevant approach for CG artists. One of the most challenging aspects of VR is that, to a certain extent, it loses the cinematographer or DP, who has traditionally designed and controlled the composition and POV before the viewer ever sees it.
The DP shares his artistic vision with the CG artist, who then knows exactly how much work is required at any moment in a movie – the DP calls the shots, the artist creates them and the viewer watches. In a VR environment the viewer has to be his own DP and decide, within the limitations of PresenZ or another VR system, what his POV will be at every moment. The artist never speaks to or even meets the viewer and, because second-guessing where the viewer will look isn’t possible, has to create and pre-render every view the viewer might see.
Pre-rendering is what makes all six degrees of freedom available to the viewer. Most VR systems have assumed the viewer’s head sits at a fixed point: he can rotate his head on that point to look in any direction, but cannot lean or move it through space. This isn’t only limiting but causes discomfort should the viewer move his head around very much while viewing. However, once a CG team has built a 3D environment that can accommodate all of those views, PresenZ supplies the imagery for the viewer as he looks about.
Zone of View
PresenZ helps overcome the dilemma that opening up these new views creates for a CG team. Not only is producing all of the necessary geometry, textures, lights and so on a lot of work for the artist, but rendering the views to a high enough standard also creates a technical impasse. PresenZ addresses this by intelligently limiting the CG rendering, gathering just enough information to cover what is visible from the allowed zone of view.
This intelligent limiting, and the zone of view itself, are what the PresenZ developers are offering. Moving on from a point of view, their software works within a zone of view – a volume around a central viewing point, about one metre deep and half a metre high, covering all the positions a seated viewer’s head can reach. Other limitations are its application to full-CG movies only – no existing camera can capture the scope required – and fog and translucent smoke effects, which at this time confound the system. However, it does support animation, including motion capture, and the integration of actors captured stereoscopically at a distance from the cameras.
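As a rough illustration of rendering only what is visible from a zone of view, here is a hedged Python sketch that culls a scene element against a seated-viewer volume. The box dimensions follow the article’s one metre by half a metre figures, but the function names, the box-shaped zone and the distance-based test are all illustrative assumptions, not PresenZ’s actual algorithm.

```python
import math

def min_distance_from_zone(point, center, half_extents):
    """Smallest distance from any head position in a box-shaped zone to a point.

    Works by clamping the point onto the box, then measuring the gap.
    """
    clamped = tuple(max(c - h, min(p, c + h))
                    for p, c, h in zip(point, center, half_extents))
    return math.dist(clamped, point)

def keep_for_prerender(point, cull_radius,
                       center=(0.0, 0.0, 0.0),
                       half_extents=(0.5, 0.25, 0.5)):
    """Keep an element if some head position in the zone could be within range.

    half_extents: metres on x (left/right), y (up/down), z (depth) — a
    hypothetical seated zone roughly 1 m deep and 0.5 m high. A real
    renderer would also test occlusion and view direction, not just distance.
    """
    return min_distance_from_zone(point, center, half_extents) <= cull_radius
```

Anything no head position inside the zone could ever reach is skipped, which is how a pre-renderer can afford film-quality detail for everything that is kept.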
PresenZ’s V-Ray integration is currently in private beta, and the app can be downloaded on Steam. Nozon will be exhibiting PresenZ as part of Chaos Group’s V-Ray Days event at SIGGRAPH 2016, outlined below. The talk will be held on 26 July and will include a demo plus recent use cases.
V-Ray at SIGGRAPH 2016
The V-Ray Days two-day promotional series at SIGGRAPH presents visual effects created by several production studios and, this year, emerging rendering systems for volumetric VR and live VR rendering as described above. Studios include Encore VFX, Digital Domain, Blur Studio, Square Enix and others. See the schedule here.
Chaos Group’s co-founder and head of software development Vladimir Koylazov will also show a new live VR rendering solution currently under development. Using a headset and progressive rendering, artists and designers will be able to author VR content in a live environment, gaining instant feedback on any change they make. See Chaos Group’s SIGGRAPH website.
Chaos Group is presenting at SIGGRAPH’s Scratching the Surface series, which highlights new tech papers and achievements. The talk will cover a new stochastic algorithm that makes rendering mirror-like flakes for materials like car paint, sand and snow faster, more memory-efficient and capable of GPU acceleration. www.chaosgroup.com