Updates to Nuke’s 3D system improve the user experience for manipulating geometry.
Nuke 13.1 continues the improvements to Nuke’s 3D system that Foundry started in Nuke 13.0, adding new 3D manipulators, 3D hotkeys and other updates that make the system easier and faster to work with. Upgrades to the 3D transform handles and better controls for the 3D pivot point make it easier to move geometry around in Nuke’s 3D space. You can also manipulate geometry in object, world or screen space, placing geometry exactly where it should be inside Nuke.
Hotkeys based on different DCC standards are available, with a new toolbar for selecting the geometry transformation tool. This helps you move between Nuke and your particular 3D software without interrupting your workflow.
The Node Graph UX has changed as well, with new user interactions. Backdrop nodes, which are used to organise large, crowded node trees by grouping certain nodes together with a box placed behind them, can now be resized from all four corners to make them easier to control. You can also shake nodes to disconnect them, then move sections of the script around the Node Graph and reconnect them elsewhere.
Cryptomatte support in 13.1 has been extended to make Cryptomattes more useful. For example, an extension to the wildcard syntax allows the ‘-’ modifier to subtract matte IDs from a defined wildcard list, making matte selections more precise when using wildcards. A new Encryptomatte node allows you to convert channel data into Cryptomatte data and merge it with existing Cryptomattes.
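As an illustration, a matte list combining a wildcard with the new ‘-’ modifier might look like this (the object names here are hypothetical):

```
bush*, -bush03
```

This would select every matte whose name matches `bush*`, then subtract `bush03` from the resulting selection.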
Unreal Reader – Unreal to NukeX
This 13.1 release helps artists design simpler workflows by adding functionality that makes Nuke work more easily with other tools in the pipeline, in particular Unreal Engine. Using the new Unreal Reader, a new beta feature in 13.1, artists can request data from Unreal from within NukeX. Whether you are using Unreal Engine alongside Nuke for previs, virtual production or final pixel rendering of VFX and animation, you can now apply NukeX tools to real-time projects.
Unreal Reader connects NukeX to an Unreal Editor session, allowing you to choose a map and sequence to work with. You can then request renders for particular frames in the sequence, and also use the node-level controls in Nuke to request various AOVs to be generated – all on demand.
For more control over the render, it is possible to expose the Unreal render settings and also synchronise the Nuke camera with the camera in the Unreal sequence. This means you can fetch the camera position from your Unreal sequence into Nuke’s 3D space, in order to use projections and texture placement in Nuke's 3D system to augment the Unreal render.
It is also possible to override Unreal's render camera with a camera from a Nuke script, using a non-destructive workflow that overrides the Unreal camera at render time without modifying the original Unreal scene. The Unreal Reader grabs frames on demand from Unreal, but once the settings are finalised you can write an entire sequence to EXR and then automatically create a Read node to view the results. Adding one of the Unreal console variables and commands gives finer control over the look and feel of the render, for example, bloom quality or shadow resolution.
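For instance, the two render qualities mentioned above correspond to standard Unreal console variables; the values shown here are purely illustrative:

```
r.BloomQuality 5
r.Shadow.MaxResolution 2048
```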
With Unreal Reader, you can work in Stencil Layer mode, which is very useful for isolating specific objects into their own render layers, generating mattes for specific objects or layering up objects, as in regular compositing. Unreal Reader includes the ability to render an Unreal scene as a cube map, allowing it to be converted into a Latlong, or environment sphere.
The new Unreal Reader in Nuke 13.1 is a fast way to move image data from Unreal into NukeX, giving users the speed and efficiency of real-time rendering with the flexibility and finer control of Nuke’s Node Graph.
Machine Learning Upgrade
The machine learning tools also receive an upgrade in Nuke 13.1. The CopyCat node has been updated to produce faster, more accurate results with less trial and error, and now works better on a range of resolutions.
A new ability to load third-party models in the Inference node opens Nuke up to community and research-developed PyTorch models. Using the new CatFileCreator node, users can create custom .cat files from PyTorch models in the TorchScript format, then load and run them in the Inference node. With CatFileCreator, you can define the model's input and output channels and resolution, and create knobs to control the model. Third-party models can be integrated into Nuke, converted into Nuke tools with modifiable knobs and shared across a team. This new workflow is expected to make it easier to use machine learning natively in Nuke pipelines.
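For context, getting a PyTorch model into the TorchScript format described above is a standard `torch.jit` step. A minimal sketch, assuming you have a trained `nn.Module` (the trivial per-pixel gain model below is a stand-in, not a Foundry example):

```python
import torch
import torch.nn as nn

# Stand-in model: a trivial per-pixel gain, purely illustrative
class Gain(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 1.2

model = Gain()
model.eval()

# Serialise the model to TorchScript; the resulting .pt file is the
# kind of TorchScript artefact you would convert for use in Nuke
scripted = torch.jit.script(model)
scripted.save("gain.pt")
```

The saved TorchScript file is self-contained, so it can be loaded and run without the original Python class definition.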
Timelines – Soft Effects, Metadata and Bottlenecks
The Nuke Studio and Hiero Timeline received several updates. Copying soft effects from the Timeline and pasting them into a Nuke script is now easier, and tighter integration enables you to transfer effects between Nuke and Nuke Studio. Soft effects are real-time effects such as Transform or TimeWarp, processed on the GPU instead of the CPU, that can be added to the Timeline in any of the workspaces.
Artists can copy track items and soft effects and paste them directly into the Node Graph. A new ColorLookUp Soft Effect, a mix control for all Soft Effects, and an Unpremult option for all colour Soft Effects have been added to give more control over how those effects are applied. The updated playhead in Nuke Studio and Hiero indicates which timeline tracks are loaded into each buffer, for quick comparisons between clips.
Metadata in the Timeline is now accessible and consistent throughout the pipeline. Timeline metadata representation now matches Nuke’s – that is, pipeline code that operates on metadata can be used in both the Timeline and the Node Graph. This means metadata can be used more efficiently in Text and Burn-ins, and per-frame metadata is available in all Soft Effects.
Review and collaboration support in Nuke Studio and Hiero now includes extensions to annotations. The mouse cursor and timeline annotations can now be sent from Hiero and Nuke Studio via Monitor Out, a feature that previews Viewer images in a floating window or on an external monitor to check the final result, including the correct output transform and aspect ratio. A dedicated Viewer Monitor Out workspace gives you access to the Monitor Out strip and controls, so you can point to particular sections of an image and draw annotations.
As Nuke Studio and Hiero projects become more complex, they take longer to load and are more likely to create bottlenecks. Nuke 13.1 reduces project load times by 30%, with plans to continue optimising for further reductions. This update, plus the ability to copy and paste clips and sequences across projects, is expected to make working with multiple projects easier – users can move clips, sequences or complex bin structures from one project to another.
OCIOv2 and Blackmagic RAW Interoperability
Nuke 13.1’s support for OCIOv2 and Blackmagic RAW, plus the Nuke-Katana interoperability planned for Katana 5.0, contribute to consistency across the pipeline. Supporting OCIOv2 regulates colour processing between CPU and GPU, and ensures a consistent viewing experience between Nuke and Nuke Studio. As artists move between Nuke and the timeline, colour remains accurate and colour management improves inside of Nuke. Studios can continue to use OCIOv1 configurations in their pipeline.
Nuke’s native file format support now includes Blackmagic RAW, which can be ingested directly into Nuke and Nuke Studio, allowing users to work directly with the RAW data inside of Nuke. Because the sensor data is retained within the file, users can non-destructively modify settings such as ISO, white balance and exposure as RAW data in post, while maintaining image quality. These changes can be saved as separate sidecar files to customise the look and share the changes without modifying the original image data.