V-Ray Next for Maya Develops Scene Intelligence, Faster Interactivity
V-Ray Next for Maya is a new version of the V-Ray renderer that applies scene intelligence to Maya production workflows for VFX and animation projects. As well as speeding up render processing through new functionality and optimisations, improvements to IPR, the interactive render engine, and to viewport rendering make it easier to concentrate on the looks and visual details that make scenes and characters more realistic and interesting.
According to Chaos Group's estimates, V-Ray Next for Maya's overall rendering performance is now 25 percent faster on average than earlier versions of V-Ray for Maya. Some of the speed improvements come from scene intelligence, which analyses the scene and optimises render calculations automatically, without extra user input. Scene intelligence has been integrated into new tools like the Adaptive Dome Light, which overcomes some shortcomings of image-based lighting (IBL).
Adaptive Dome Light
IBL matches a spherical dome light with an HDR image of the environment, but because the dome light can include bright light from unpredictable sources, uniform sampling may be inefficient and produce noise. The new Adaptive Dome Light uses the Light Cache calculation phase to learn which parts of the dome light are most likely to affect the scene. In other words, it automatically determines which portions of the environment to sample and which to ignore, saving time and improving accuracy.
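The underlying idea of concentrating samples on the parts of the environment that actually contribute light is importance sampling. The toy sketch below illustrates the principle with a made-up luminance table, not V-Ray's actual implementation: a cumulative distribution is built over dome regions so that a bright "sun" region receives most of the samples.

```python
import bisect
import random

# Toy environment map: one luminance value per dome region.
# Region 2 is a bright "sun"; the rest of the dome is dim.
luminance = [0.05, 0.05, 9.0, 0.05, 0.05, 0.8]

# Build a cumulative distribution so regions are sampled in
# proportion to their contribution to the lighting.
total = sum(luminance)
cdf, running = [], 0.0
for lum in luminance:
    running += lum
    cdf.append(running / total)

def sample_region(u):
    """Map a uniform random number u in [0, 1) to a region index."""
    return bisect.bisect_right(cdf, u)

# Drawing many samples concentrates effort on the bright region
# instead of wasting rays on parts of the dome that barely matter.
random.seed(0)
counts = [0] * len(luminance)
for _ in range(10_000):
    counts[sample_region(random.random())] += 1
```

With this table the sun region holds 90 percent of the total energy, so roughly nine in ten samples land there; a uniform sampler would hit it only once in six draws.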
V-Ray Next has an improved IPR that runs directly in the Maya viewport or the V-Ray Frame Buffer, to speed up interactivity and workflows. IPR, the software's interactive render engine, uses CPU and GPU acceleration to display the artist's edits to objects, lights and materials on screen in real time, as they are made. As well as reducing the time needed to compile the geometry and show the first pixels, improved interactive rendering displays continuous updates while editing or scrubbing through animations. It's also now possible to render V-Ray-quality playblasts for animation previz.
For look development work, users can analyse and fine-tune specific aspects of their scene with V-Ray Next for Maya's new Debug shading mode, which isolates selected materials, textures, objects and lights. A GPU-accelerated AI Denoiser now produces noise-free updates fast enough for interactive work, giving quick insight into lighting set-ups.
V-Ray GPU Next
V-Ray Next for Maya also includes V-Ray GPU. A production-ready renderer for professional use, built on a new GPU rendering architecture, V-Ray GPU is now about twice as fast as the previous version. V-Ray GPU Next supports fast rendering of volumetric effects such as smoke, fire and fog, and the addition of GPU bucket rendering speeds up distributed rendering.
GPU Volume Rendering
V-Ray GPU Next also supports Cryptomatte output, developed to give compositors more control. Cryptomatte multichannel images store pairs of object IDs and pixel coverage values, plus the original object names as extra metadata. The compositor can extract mattes for selected IDs, and then use them to mask out specific elements when fine-tuning the composite.
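The matte-extraction step can be sketched in a few lines. This is a simplified illustration of the concept, not a reader of actual Cryptomatte EXR files: each pixel is assumed to hold ranked (ID, coverage) pairs, and summing the coverage for a chosen ID yields a soft, antialiasing-aware matte. The IDs and pixel data below are hypothetical.

```python
# Hypothetical per-pixel data: ranked (object_id, coverage) pairs,
# where coverage values in a pixel sum to at most 1.0.
pixels = [
    [(0xAAAA, 1.0)],                  # pixel fully covered by object A
    [(0xAAAA, 0.6), (0xBBBB, 0.4)],  # antialiased edge shared by A and B
    [(0xBBBB, 1.0)],                  # pixel fully covered by object B
]

def extract_matte(pixels, target_id):
    """Sum the coverage of target_id in each pixel to build a soft matte."""
    return [
        sum(cov for obj_id, cov in pairs if obj_id == target_id)
        for pairs in pixels
    ]

matte_a = extract_matte(pixels, 0xAAAA)  # [1.0, 0.6, 0.0]
```

The fractional 0.6 on the edge pixel is what makes Cryptomatte mattes hold up under antialiasing, motion blur and depth of field, where a hard per-pixel object ID would alias.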
Materials and Camera
V-Ray Next for Maya has a Physical Hair Material that produces realistic-looking hair with accurate highlights, and adds new glint and glitter controls. For compatibility with materials from Substance Designer and real-time engines such as Unreal and Unity, the V-Ray Material now includes a Metalness parameter for PBR shaders. Toon Shader is a new cel shader for non-photorealistic cartoon effects, and the line controls in VRayToon have been upgraded. Layered textures now support blend modes and per-layer masking controls.
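The metalness convention the paragraph refers to can be sketched as follows. This shows the standard PBR metal/dielectric parameterisation used by Substance and real-time engines generally, not V-Ray's internal shading code: dielectrics get a neutral specular of roughly 4 percent reflectance at normal incidence (F0) and keep their diffuse colour, while metals tint the specular with the base colour and have no diffuse component.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t."""
    return a + (b - a) * t

def metalness_to_specular(base_color, metalness):
    """Standard PBR metalness mapping (illustrative, not V-Ray source):
    F0 blends from a 4% dielectric grey towards the base colour,
    while the diffuse term fades out as metalness rises."""
    f0 = tuple(lerp(0.04, c, metalness) for c in base_color)
    diffuse = tuple(c * (1.0 - metalness) for c in base_color)
    return f0, diffuse

# A pure metal: coloured specular, no diffuse at all.
gold_f0, gold_diffuse = metalness_to_specular((1.0, 0.77, 0.34), 1.0)

# A dielectric: neutral 4% specular, diffuse keeps the base colour.
plastic_f0, plastic_diffuse = metalness_to_specular((0.2, 0.4, 0.8), 0.0)
```

Because a single metalness value drives both terms, a texture authored in Substance Designer can move between V-Ray and a game engine without re-authoring separate diffuse and specular maps.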
The Physical Camera now has automatic white balance and exposure controls, and adds rolling-shutter motion blur. Lens effects such as glare and bloom are faster, simpler to set up and ready to composite. New workflows in V-Ray Next for Maya support Alembic 1.7 with layering, to make handling and updating of Alembic data faster and more efficient.
Cloud support for V-Ray, developed to render scenes directly to the cloud in a single step, is now in open beta. www.chaosgroup.com