VP Pro XR
Mo-Sys VP Pro XR is the first purpose-built Cinematic XR server solution on the market. It’s a radical new approach to delivering cinematic standards to on-set real-time virtual production using LED volumes. It’s designed for LED stages with or without set extensions but can also be used with blue/green screens. It has been designed to enable traditional shooting techniques within an LED volume, with a focus on composite image quality.
Typically, directors and cinematographers must make continuous calculations to ensure the real and virtual elements will match in post-production. This is often a costly and repetitive process. Mo-Sys, a pioneer in platforms that combine graphics rendering with camera and lens tracking for higher-quality virtual productions, has made it even easier to produce seamless, high-end productions with the latest iteration of VP Pro XR, which combines the power and capability of professional systems with the easy operation of prosumer solutions and affordable scalability. VP Pro XR offers seamless set extensions with minimal XR delay and includes unique capabilities such as Cinematic XR Focus (see below).
Mo-Sys VP Pro XR is designed to enable the use of traditional shooting techniques within virtual productions and to remove the storytelling limitations imposed by current XR stage designs. Mo-Sys has added a range of innovative capabilities to VP Pro XR, which supports Epic Games’ Unreal Engine 4.27, including:
Cinematic XR Focus is an industry-first capability that enables seamless interaction between the virtual and real worlds. Winner of the Cine Gear 2021 Technical Award, this feature ensures that an LED wall can be used as more than just a backdrop, allowing it to integrate with the real stage. It gives cinematographers the means to rack focus seamlessly deep into the virtual world and to create layered images that enhance their visual storytelling. This saves both time and money by enabling shots to be combined in a way cinematographers are already familiar with.
Mo-Sys achieves this by using the same wireless lens control system commonly used in filmmaking; the feature is compatible with Preston wireless lens controllers (Hand Unit 3 and MDR-3). The lens controller is synchronized with the output of the Unreal Engine graphics, working with Mo-Sys’ StarTracker camera tracking technology to constantly track the distance between the camera and the LED wall.
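One plausible way to picture the handoff described above is a simple split: while the dialled focus distance is shorter than the tracked camera-to-wall distance, the physical lens is driven normally; once the focus target lies "behind" the wall, the lens holds at the LED surface and the remaining focus pull is handed to the virtual scene. This minimal sketch is an illustration under that assumption, not Mo-Sys' actual implementation; all names are invented.

```python
def split_focus(requested_focus_m: float, wall_distance_m: float):
    """Decide how to drive the real lens and the virtual camera focus.

    requested_focus_m: focus distance dialled on the hand unit (metres).
    wall_distance_m: tracked distance from camera to LED wall (metres).
    Returns (physical_focus_m, virtual_focus_m), where virtual_focus_m
    is None while the focus target is still in the real scene.
    """
    if requested_focus_m <= wall_distance_m:
        # Target is in the real scene: drive the lens normally.
        return requested_focus_m, None
    # Target is inside the virtual world: hold the physical lens on the
    # LED surface and let the engine refocus the virtual scene at the
    # requested distance.
    return wall_distance_m, requested_focus_m
```

With values like these, a single focus pull crosses the wall smoothly: `split_focus(3.0, 5.0)` keeps the lens at 3 m, while `split_focus(8.0, 5.0)` parks the lens at the 5 m wall and asks the engine to focus at 8 m.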
NearTime® is a fresh and unique workflow for virtual production. It meets the key requirement that cast and crew see the full effect of the shot on-set in real time, while delivering a higher-quality version of the shot, completely automated, in a timescale that matches the practical requirements of the production: ‘near-time’. Winner of the HPA Engineering Excellence 2021 award, this solution is cost-effective, uncompromising in quality and timely, without the huge overheads of real-time augmented reality. NearTime draws on Mo-Sys’ proven expertise in camera tracking and live compositing, delivering a complete system in partnership with the AWS Media and Entertainment team.
On set, the time-optimized render is driven by the Mo-Sys StarTracker camera and lens movement system. That gives the DP complete freedom to move the camera in any of the six axes of motion. Because lens data is tracked, the DP can also use the same aperture and focus work that are a central part of the language of movies, and see it resolved seamlessly in both the real and virtual parts of the scene.
As soon as the foreground video of a shot reaches the cloud, final rendering starts automatically and immediately, without the need for any operator intervention. Mo-Sys uses the Frame.io platform to deliver these renders back to set and make them available securely anywhere in the world, so the PA and DIT can instantly add comments to takes or delete those which obviously cannot be used.
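The "rendering starts the moment footage lands" behaviour amounts to an event-driven job queue: an upload notification fires and a render job is enqueued with no human in the loop. The sketch below is a hypothetical stand-in (in a real deployment the trigger would be a cloud storage event); the function names and output naming are invented for illustration.

```python
import queue

# Pending full-quality render jobs, keyed by clip name.
render_jobs: "queue.Queue[str]" = queue.Queue()

def on_foreground_uploaded(clip_name: str) -> None:
    """Fires as soon as a take's foreground video lands in storage;
    queues the full-quality render with no operator action needed."""
    render_jobs.put(clip_name)

def drain_jobs() -> list[str]:
    """Stand-in for the render farm picking up queued takes and
    producing finished composites."""
    done = []
    while not render_jobs.empty():
        clip = render_jobs.get()
        done.append(f"{clip}.composite.exr")  # placeholder output name
    return done
```

The design point is simply that the trigger is the upload itself, not an operator: calling `on_foreground_uploaded("take_07")` is all it takes for `take_07.composite.exr` to appear in the next drain.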
The automatic render and composite are not locked in terms of content. If additional work is needed it can be added at any time. What the automatic rendering and layering achieves is the elimination of dull, repetitive work, allowing post-production artists to focus on rewarding, valuable, creative work.
Just as rendering starts the moment the take finishes, so the composite is made available to editorial the instant the render finishes. That means the editor is always working with the finished images: both editors and directors have told us they hate hopping in and out of resolutions and render quality, so this is another valuable boost to productivity.
While a two-minute take might require an hour to be re-rendered, the AWS cloud is effectively infinitely scalable, so multiple takes can be rendered in parallel. Take two will not take two hours to be delivered: it too will take an hour from the camera stopping. Directors and DPs can work as quickly as they want. Finished renders are delivered to computers on set for immediate confidence checks.
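The scheduling claim above can be made concrete with a small sketch: on a single render machine, delivery time grows with a take's position in the queue, whereas with scalable parallel capacity every take is delivered one render-time after its own cut. The one-hour figure is the illustrative number from the text, not a guaranteed render time.

```python
RENDER_HOURS = 1.0  # illustrative: a two-minute take re-renders in ~1 h

def sequential_delivery(n_takes: int) -> list[float]:
    """Hours after its own cut that each take is delivered if takes
    queue up on a single render machine."""
    return [RENDER_HOURS * (i + 1) for i in range(n_takes)]

def parallel_delivery(n_takes: int) -> list[float]:
    """With scalable cloud capacity, each take renders on its own
    nodes, so every take arrives one render-time after it wraps."""
    return [RENDER_HOURS] * n_takes
```

So for three takes, sequential rendering delivers them 1, 2 and 3 hours after their respective cuts, while parallel rendering delivers each one an hour after it wraps, which is the behaviour the text describes.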
A new addition to this latest version of VP Pro XR is an Online Lens Library, giving users access to a wide selection of lens distortion calibration tools and letting them tweak their lenses on-set in a highly cost-effective way.