The NVIDIA Omniverse platform is an RTX-based 3D simulation and collaboration platform capable of simulating photorealistic 3D objects and scenes in real time. NVIDIA launched its open beta at the virtual GTC event this week.
Using the platform, remote teams can collaborate simultaneously on projects in a way similar to editing an online document. Typical users include architects iterating on 3D building designs, animators revising 3D scenes, and engineers collaborating on autonomous vehicle development.
Artists and engineers working in robotics, automotive, architecture, engineering and construction, manufacturing and media and entertainment (M&E) all need to continuously improve their creative processes and animation pipelines over time. The Omniverse platform acts as a hub, where new capabilities are exposed as microservices to connected clients and applications. It aims for universal interoperability across different applications and 3D systems vendors, and its real-time scene updates are based on open standards and protocols.
Pixar’s USD and NVIDIA’s MDL
The platform supports real-time photorealistic rendering, physics, materials and interactive workflows between 3D software packages. It is based on Pixar’s Universal Scene Description (USD), a format for universal file interchange between 3D applications, directly sharing most aspects of a 3D scene while maintaining application-specific data.
The USD scene representation has an API allowing complex property inheritance, instancing, layering, loading on demand and other features. Omniverse uses USD for interchange through its central database service, called Nucleus (see below).
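Layering is central to how USD composes a scene: stronger layers can sparsely override opinions expressed by weaker ones without rewriting them. The real USD API lives in Pixar's pxr C++/Python library; the self-contained sketch below only illustrates the opinion-resolution idea, with layer contents and prim paths invented for the example.

```python
# Rough illustration of USD-style layering: each layer holds sparse
# "opinions" about prim properties, and the strongest layer with an
# opinion wins. (Real USD uses Pixar's pxr library; the layer contents
# and prim paths here are invented for the example.)

def resolve(layers, prim, prop):
    """Return the strongest opinion for (prim, prop), or None."""
    for layer in layers:  # ordered strongest -> weakest
        opinion = layer.get(prim, {}).get(prop)
        if opinion is not None:
            return opinion
    return None

# A session layer sparsely overriding a base asset layer:
session_layer = {"/Building/Door": {"color": "red"}}
asset_layer   = {"/Building/Door": {"color": "white", "height": 2.1}}

stack = [session_layer, asset_layer]
print(resolve(stack, "/Building/Door", "color"))   # strongest opinion wins
print(resolve(stack, "/Building/Door", "height"))  # falls through to base
```

The key property is that the session layer stays tiny: it records only what it changes, which is also what makes USD-based collaboration efficient.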
Materials in Omniverse are represented by NVIDIA’s open-source MDL (Material Definition Language). NVIDIA has developed a custom USD schema to represent material assignments and parameters, preserving these during interchange between different application-specific material definitions. This standard definition enables materials to look similar, if not identical, across multiple applications.
USD’s structure allows clients to relay only the changes made to objects, environments and other design elements within the collaborative scene, which means edits are efficiently communicated between applications while maintaining overall integrity.
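The delta-only idea can be sketched in plain Python: compare the current scene state against the last published snapshot and relay only the properties that changed. This is a conceptual illustration, not the actual USD or Nucleus wire format, and the scene contents are invented.

```python
# Conceptual sketch of delta-only updates: only properties that changed
# since the last published snapshot are relayed, rather than the full
# scene. (Not the real USD/Nucleus wire format; example data invented.)

def diff_scene(previous, current):
    """Return {prim: {prop: new_value}} for changed or added properties."""
    delta = {}
    for prim, props in current.items():
        old = previous.get(prim, {})
        changed = {k: v for k, v in props.items() if old.get(k) != v}
        if changed:
            delta[prim] = changed
    return delta

before = {"/Scene/Lamp": {"intensity": 500, "color": "warm"}}
after  = {"/Scene/Lamp": {"intensity": 800, "color": "warm"},
          "/Scene/Chair": {"material": "oak"}}

print(diff_scene(before, after))
# {'/Scene/Lamp': {'intensity': 800}, '/Scene/Chair': {'material': 'oak'}}
```

Unchanged properties (the lamp's colour above) never cross the wire, which is why large scenes remain responsive during collaboration.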
Inside Omniverse – Tools and Services
On top of Omniverse’s USD / MDL foundation, the platform has five main components – Omniverse Connect, Nucleus, Kit, Simulation and RTX. These components, plus the connected third-party digital content creation (DCC) tools and other connected Omniverse microservices, make up the whole Omniverse system.
Omniverse Nucleus has a set of basic services that various client applications, renderers and microservices use to share and modify representations of virtual worlds. Nucleus works through a publish/subscribe model – that is, Omniverse clients can publish modifications to digital assets and virtual worlds to the Nucleus Database (DB), or subscribe to their changes. Changes are transmitted in real-time between connected applications.
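The publish/subscribe pattern described above can be illustrated with a minimal, self-contained sketch: clients subscribe to asset paths and are notified whenever another client publishes a change. This is a conceptual stand-in, not the Nucleus API, and the class and asset names are invented.

```python
# Minimal publish/subscribe sketch of the Nucleus idea: clients
# subscribe to asset paths and are notified when another client
# publishes a change. (A conceptual stand-in, not the Nucleus API;
# class and asset names invented.)

from collections import defaultdict

class SceneDB:
    def __init__(self):
        self.assets = {}                      # path -> latest state
        self.subscribers = defaultdict(list)  # path -> callbacks

    def subscribe(self, path, callback):
        self.subscribers[path].append(callback)

    def publish(self, path, change):
        state = self.assets.setdefault(path, {})
        state.update(change)                  # apply the delta centrally
        for cb in self.subscribers[path]:     # fan out to subscribers
            cb(path, change)

db = SceneDB()
received = []
db.subscribe("/Project/Robot", lambda p, c: received.append((p, c)))
db.publish("/Project/Robot", {"arm_angle": 42.0})
print(received)  # [('/Project/Robot', {'arm_angle': 42.0})]
```

In the real platform the database is a networked service and subscribers are full DCC applications, but the flow is the same: publish a delta, and every subscribed client sees it.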
Omniverse Connect libraries are distributed via plugins that client applications use to connect to Nucleus and to publish and subscribe to individual assets and complete worlds. Once synchronised, a software plugin will use the Omniverse Connect libraries to apply updates from outside and publish changes generated from inside – as necessary.
As the application makes changes to its USD representation of the scene, Omniverse Connect keeps track of the differences and publishes them to Nucleus for distribution to subscribers.
Omniverse Kit is a toolkit for building native Omniverse applications and microservices. It is built on a base framework whose functionality is accessed through lightweight extensions – plugins authored in Python or C++. A flexible, extensible development platform for apps and microservices, Kit can run headless or with a UI that can be customised with its UI engine.
Extensions are building blocks that users assemble in many ways to create different types of applications. They include RTX Viewport extensions, Content Browser extensions, USD widget and window extensions and the Omniverse UI. Because they are written in Python, they are highly customisable, and the catalogue of extensions is expected to grow. They are supplied with complete source code to help developers create, add and modify tools and workflows.
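The extension pattern Kit follows can be sketched in plain Python: small units of functionality expose startup and shutdown hooks that a host framework calls when loading or unloading them. The registry, class and extension names below are invented for the sketch; real Kit extensions implement NVIDIA's own extension interface rather than this one.

```python
# Self-contained sketch of the extension/plugin pattern Kit uses:
# units of functionality with startup/shutdown hooks that a host
# framework enables and disables on demand. (Registry, class and
# extension names invented; not NVIDIA's actual extension API.)

class Extension:
    def on_startup(self): ...
    def on_shutdown(self): ...

class HelloWindowExtension(Extension):
    def on_startup(self):
        self.active = True    # e.g. build a window, register menu items

    def on_shutdown(self):
        self.active = False   # tear everything down again

class ExtensionManager:
    def __init__(self):
        self.loaded = {}

    def enable(self, name, ext_cls):
        ext = ext_cls()
        ext.on_startup()      # framework drives the lifecycle
        self.loaded[name] = ext

    def disable(self, name):
        self.loaded.pop(name).on_shutdown()

mgr = ExtensionManager()
mgr.enable("hello.window", HelloWindowExtension)
print("hello.window" in mgr.loaded)  # True
```

Because each extension is an isolated unit behind a small lifecycle interface, an application becomes whatever set of extensions is enabled, which is how Kit-built apps can range from headless services to full editors.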
In the Omniverse Pipeline, DCC applications, plus those the user has built using Omniverse Kit, can all be exported to the USD file format and have support for MDL materials. Using Omniverse Connector plugins, Omniverse portals are created between these apps and the Nucleus Database. The Nucleus server also supplies functionality as headless micro-services, and delivers rendered results to different visualisation clients - including VR headsets and AR devices.
Simulation in Omniverse is done through NVIDIA plug-ins or microservices for Omniverse Kit. Currently, Omniverse physics includes rigid-body dynamics, destruction and fracture, vehicle dynamics and fluid dynamics. One of the first available simulation tools is NVIDIA’s PhysX, the open-source physics simulator widely used in computer games. The objects involved in a simulation, their properties, constraints and so on are specified in a custom USD schema. Kit has tools for editing the simulation set-up, starting and stopping it, and adjusting parameters.
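The separation between declared properties and the stepping loop can be shown with a generic rigid-body sketch: a body's mass and state are declared as data, in the spirit of a schema, and a solver advances them with semi-implicit Euler integration. This is a generic physics illustration, not PhysX or NVIDIA's USD physics schema.

```python
# Generic rigid-body sketch (not PhysX or NVIDIA's schema): a body's
# properties are declared as data, then stepped with semi-implicit
# Euler integration under gravity, with a simple ground constraint.

GRAVITY = -9.81  # m/s^2

# Declared properties and state, 1D for brevity (height above ground).
body = {"mass": 2.0, "position": 10.0, "velocity": 0.0}

def step(body, dt):
    body["velocity"] += GRAVITY * dt            # integrate velocity first
    body["position"] += body["velocity"] * dt   # then position (semi-implicit)
    if body["position"] < 0.0:                  # ground-plane constraint
        body["position"] = 0.0
        body["velocity"] = 0.0

for _ in range(100):                            # simulate 1 second at 100 Hz
    step(body, 0.01)

print(round(body["position"], 2))               # height after 1 s of free fall
```

In Omniverse the equivalent data lives in USD, so the same scene description that renderers read also drives the solvers.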
Omniverse supports renderers that comply with Pixar’s Hydra architecture. One of these is the new Omniverse RTX viewport. RTX uses hardware RT cores in Turing and upcoming NVIDIA architectures for real-time ray tracing and path-tracing. Because the renderer doesn’t rasterise before ray-tracing, very large scenes can be handled in real-time. It has two modes – traditional ray tracing for fast performance and path tracing for high quality results.
Omniverse RTX natively supports multiple GPUs in a single system and will soon support interactive rendering – in which the rendered image updates in real time as changes are made in your scene – across multiple systems.
Early Access and Software Partners
The open beta of Omniverse follows a one-year early access program in which Ericsson, Foster + Partners, ILM and over 40 other companies – and as many as 400 individual creators and developers – have been evaluating the platform and sending reactions and ideas to the NVIDIA engineering team.
At this time, NVIDIA Omniverse connects to a range of content creation applications, and NVIDIA has created demos, called Apps and Experiences, to show how it works in different workflows. Apps are built using Omniverse Kit and serve as a starting point for developers learning to create their own apps. They will continually gain new features and capabilities. Experiences, on the other hand, are packages containing all the components and extensions needed to address specific workflows.
Early adopters of NVIDIA Omniverse so far include architectural design and engineering firm Foster + Partners in the UK, which is using Omniverse to help with data-exchange workflows and collaborative design processes. Woods Bagot, an architectural and consulting practice, is working with Omniverse to set up a hybrid cloud workflow for the design of complex models and visualisations of buildings, and telecommunications company Ericsson is using real-world city models in Omniverse to simulate and visualise signal propagation for its 5G network deployments.
Omniverse has support from software companies including Adobe, Autodesk, Bentley Systems, Robert McNeel & Associates and SideFX. Blender is working with NVIDIA to add USD capabilities facilitating Omniverse integration with its software. The goal is to allow artists and designers to use the collaborative functionality of Omniverse while working with their preferred applications.
Autodesk’s senior vice president for Design and Creation Products Amy Bunszel said, “Projects and teams are becoming more complex and we are confident Autodesk users from all industries will respond to Omniverse’s ability to create a more collaborative and immersive experience. This is what the future of work looks like.”