SCRATCH / SCRATCH VR Suite 8.6 Tackle Stitching and Expand HDR Tools
ASSIMILATE has launched an open beta for SCRATCH 8.6 and the SCRATCH VR Suite 8.6, the new versions of its real-time software and workflow for processing VR/360 and 2D/3D content from dailies to conform, grading, compositing and finishing. Both open-beta versions give artists an opportunity to try all of the proposed tools while submitting requests and recommendations for extra functionality or modifications.
SCRATCH Web for cloud-based, real-time review and collaboration, and SCRATCH Play for review and playback, are also included in these updates and the Beta. Both products support VR/360 and 2D/3D content.
In the SCRATCH VR Suite 8.6, new stitching functionality allows users to load all source media from a 360 camera rig into SCRATCH VR and combine it into a single equirectangular image. Supported camera stitch templates include AutoPano projects, Hugin and PTStitch scripts.
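An equirectangular image maps every viewing direction to a pixel: longitude across the width, latitude down the height, and stitching resamples each camera's pixels into that shared grid. A minimal sketch of the direction-to-pixel mapping (the axis conventions and function name here are illustrative assumptions, not SCRATCH's internals):

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D viewing direction (y up, +z forward; an assumed
    convention) to (u, v) pixel coordinates in a width x height
    equirectangular image."""
    norm = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)          # longitude, -pi .. pi
    lat = math.asin(y / norm)       # latitude, -pi/2 .. pi/2
    u = (lon / (2 * math.pi) + 0.5) * width   # left-to-right
    v = (0.5 - lat / math.pi) * height        # top-to-bottom
    return u, v
```

With this convention, the forward direction lands at the image centre and a direction 90 degrees to the right lands three quarters of the way across the frame.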
SCRATCH VR can load, set and play back ambisonic audio files to complete a 360 immersive experience. Video with 360 sound can be published directly to YouTube 360. Extra overlay handles in the existing 2D-Equirectangular tool make it easier to position 2D elements in a 360 scene.
HDR video support has been improved in both SCRATCH and SCRATCH VR: the PQ [Perceptual Quantizer] and HLG [Hybrid Log-Gamma] transfer functions - PQ is standardised as SMPTE ST 2084 and HLG as ARIB STD-B67/ITU-R BT.2100 - are now an integral part of SCRATCH colour management. SCRATCH scopes automatically switch to HDR mode when needed, showing levels on a nit scale and highlighting any reference level that you set.
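As a point of reference, the PQ curve maps absolute luminance up to 10,000 nits onto a normalised signal. A minimal sketch of the inverse EOTF using the constants published in SMPTE ST 2084 (an illustration of the maths, not SCRATCH's implementation):

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance in cd/m^2 (nits) to a
    PQ-encoded signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0   # normalise to the 10,000-nit ceiling
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2
```

The curve reaches 1.0 exactly at 10,000 nits, and SDR reference white (100 nits) sits just above the middle of the signal range - which is why PQ scopes are read on a nit scale rather than in percent.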
At the project level, users can define the HDR mastering metadata - colour space, colour primaries, white point, luminance levels and other parameters. This metadata is automatically passed over the HDMI video interface to the display, including through hardware from AJA, Blackmagic Design and BlueFish444. Beyond the static mastering metadata, SCRATCH can calculate content-derived luminance metadata such as MaxCLL and MaxFALL, and HDR footage can be published directly to YouTube.
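MaxCLL (maximum content light level) is the brightest single pixel across the programme, and MaxFALL (maximum frame-average light level) is the highest per-frame average, both measured in nits on the brightest of a pixel's R, G, B components. A minimal sketch of that calculation (a hypothetical helper, not SCRATCH's own code):

```python
def compute_maxcll_maxfall(frames):
    """frames: iterable of frames, each a non-empty list of (R, G, B)
    component luminances in nits. Returns (MaxCLL, MaxFALL)."""
    maxcll = 0.0
    maxfall = 0.0
    for pixels in frames:
        frame_max = 0.0
        total = 0.0
        for r, g, b in pixels:
            level = max(r, g, b)     # brightest component of this pixel
            frame_max = max(frame_max, level)
            total += level
        maxcll = max(maxcll, frame_max)           # brightest pixel overall
        maxfall = max(maxfall, total / len(pixels))  # brightest frame average
    return maxcll, maxfall
```

Because both values depend on every pixel of every frame, they can only be finalised once the content itself is rendered, which is why they are computed rather than entered by hand.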
Format support is extended with the option to create 10-bit H.264 .mp4 or .mov output, which is important for generating HDR deliverables. The files embed all HDR metadata - colour space, transfer function and mastering luminance levels. Version 8.6 contains all recent standard transforms as published by the Academy, and introduces ACEScct, a derivative of ACEScc (also referred to as ACES log) that has a different fall-off in the shadows of an image.
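The difference between the two is that ACEScc is a pure log curve all the way down, while ACEScct splices a linear "toe" onto the shadows below a cut point, giving a gentler fall-off that many colourists find behaves more like traditional log film scans under lift/gamma/gain. A sketch of the linear-to-ACEScct encoding using the constants from the Academy's published specification:

```python
import math

# Constants from the ACEScct specification (Academy S-2016-001).
A = 10.5402377416545
B = 0.0729055341958355
CUT = 0.0078125   # below this linear value, ACEScct is linear, not log

def lin_to_acescct(x: float) -> float:
    """Encode a linear ACES value as ACEScct."""
    if x <= CUT:
        return A * x + B                     # linear toe in the shadows
    return (math.log2(x) + 9.72) / 17.52     # log segment, same as ACEScc
```

The two segments meet exactly at the cut point, so the curve is continuous; 18% grey encodes to roughly 0.41, close to where log grading tools expect mid-grey to sit.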
A new high-speed DNx MXF encoder is added for OpAtom and OP1a, including DNxHR, and a more efficient DNx MXF decoder now reads OpAtom, OP1a and OP1b variants. The encoder includes audio channel name information within the OpAtom audio filename so that info remains available in an Avid round-trip. Maximum H.264 output resolution is increased to 8192x6144 - higher resolutions make a particular difference in VR environments. QuickTime and XAVC MXF playback is faster.
The DNG raw reader handles a wider range of DNG variants, including the new Panasonic VariCam LT DNG and Kinefinity RAW, and the debayer function is improved. Support has also been added for still-camera raw formats from Canon, Nikon, Kodak, Hasselblad, Pentax, Leica and others.
For DITs, the reporting function now makes it possible to create a report of all clips from a timeline, a whole project or just a selection of shots. Reports include metadata such as a thumbnail, clip name, timecode, scene, take, comments and any other metadata attached to a clip. Predefined templates are available, or users may create their own.
Other UI updates for content creators cover the login screen, matrix layout, swipe sensitivity, Player stack, toolbar and tool-tips.