Unreal Engine 4.24 Grows a Hair System and New USD Support
Epic Games' Unreal Engine 4.24 has a new strand-based hair and fur system, new landscape tools, atmospheric skies, USD support and better multi-display rendering. The biggest change is that the Engine now incorporates all of the functions that were previously part of Epic Games' Unreal Studio, which will be retired. The change makes Unreal Engine more useful for visualisation in game development, architecture, film and television, automotive design, training and simulation.
One of the Studio tools included in the 4.24 release is Datasmith, which directly converts 3D scenes from content creation, CAD and BIM applications such as 3ds Max, SketchUp Pro, Cinema 4D, Revit, Rhino, SolidWorks, CATIA and others. The integration also adds mesh editing tools, UV projections, jacketing and defeaturing tools, and a Variant Manager.
Assets are imported into Unreal Engine as fully assembled scenes with geometry, lights, cameras, materials and textures applied and in their original locations. Once the data is loaded, all assets become standard Unreal Engine assets.
The strand-based hair and fur system, still in an experimental stage, has been developed to simulate and render hundreds of thousands of photoreal hairs at speeds close to real time from grooms created in 3D applications. It uses a strand-based workflow to render each individual strand of hair with physically accurate motion.
The strands can follow skin deformations for realistic fur and facial hair, giving results that are suitable for human characters as well as furry or hairy creatures. The system includes a hair shader, a rendering system and integrated physics simulation.
Unreal Engine users now have access to the entire Quixel Megascans library at no charge, either through the Unreal Engine Marketplace or through Quixel Bridge. It makes over 10,000 very high resolution photoreal 2D and 3D photogrammetry assets available to use when creating scenes and environments in Unreal Engine.
Nondestructive landscape creation and editing tools, now in beta, make it possible to create and edit large terrains directly within the Unreal Editor. These tools allow multiple height maps and paint layers to be added to a landscape and edited independently of each other.
Blueprints Visual Scripting, the node-based scripting system used to create elements from within the Unreal Editor, can be used to build custom brushes for terrains, such as a brush that automatically conforms the height of the landscape to the bottom of buildings.
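As a rough idea of what such a brush looks like under the hood, the sketch below subclasses the landscape Blueprint brush base class on the C++ side and overrides its render hook. The class name ALandscapeBlueprintBrushBase and the Render_Native signature are assumptions about the 4.24-era Landscape module, and the flattening logic itself is left as a hypothetical comment rather than a verified implementation.

```cpp
// FlattenUnderBuildingsBrush.h -- a minimal sketch only; it assumes the 4.24-era
// ALandscapeBlueprintBrushBase class and Render_Native signature from the Landscape
// module, and the "flatten under buildings" behaviour is hypothetical.
#pragma once

#include "CoreMinimal.h"
#include "Engine/TextureRenderTarget2D.h"
#include "LandscapeBlueprintBrushBase.h"
#include "FlattenUnderBuildingsBrush.generated.h"

UCLASS()
class AFlattenUnderBuildingsBrush : public ALandscapeBlueprintBrushBase
{
    GENERATED_BODY()

public:
    // Height (in landscape space) that the terrain should be flattened to under each building.
    UPROPERTY(EditAnywhere, Category = "Brush")
    float TargetHeight = 0.0f;

    // Called by the landscape edit-layer system with the combined heightmap or weightmap,
    // so the brush can write its contribution into a render target. (Assumed signature.)
    virtual UTextureRenderTarget2D* Render_Native(bool bIsHeightmap,
                                                  UTextureRenderTarget2D* InCombinedResult,
                                                  const FName& InWeightmapLayerName) override
    {
        if (bIsHeightmap)
        {
            // Hypothetical step: rasterise the footprints of tagged building actors into
            // InCombinedResult at TargetHeight, for example by drawing into the render
            // target with a material, so the terrain meets the base of each building.
        }
        return InCombinedResult;
    }
};
```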
Unreal Engine can now be used to create realistic skies for real-time outdoor environments very quickly, including sunsets and views from space. The Sun Positioner includes a new physically based SkyAtmosphere component that renders an atmosphere viewable from the ground or from the air, and can show the time of day changing dynamically.
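For teams setting the sky up from C++ rather than through the editor, a minimal sketch of an actor carrying the new component might look like the following. The actor and variable names are illustrative, and the sun-light hookup and time-of-day animation are only described in comments rather than shown.

```cpp
// SkyActor.h -- a minimal sketch of an actor that carries the new SkyAtmosphere
// component (USkyAtmosphereComponent, added in 4.24); names here are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SkyAtmosphereComponent.h"
#include "SkyActor.generated.h"

UCLASS()
class ASkyActor : public AActor
{
    GENERATED_BODY()

public:
    ASkyActor()
    {
        // A single component renders the whole atmosphere; it can be viewed from
        // ground level or from space, depending on where the camera is.
        SkyAtmosphere = CreateDefaultSubobject<USkyAtmosphereComponent>(TEXT("SkyAtmosphere"));
        SetRootComponent(SkyAtmosphere);

        // A directional light flagged as the atmosphere's sun light (configured in the
        // editor or in code) drives sunrise and sunset colours; rotating that light
        // over time produces the dynamic time-of-day changes described above.
    }

private:
    UPROPERTY(VisibleAnywhere, Category = "Sky")
    USkyAtmosphereComponent* SkyAtmosphere;
};
```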
The new Screen Space Global Illumination (SSGI), now in beta, is an alternative to Unreal Engine's existing ray-traced GI method. It simulates dynamic, natural indirect lighting, in which light bouncing off one object affects the colour of another within the screen view. SSGI can also be used to generate dynamic lighting from emissive surfaces such as neon lights or other bright surfaces, and is efficient enough to scale across desktop and console platforms.
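How SSGI is switched on is not spelled out above. Assuming it is controlled through a console variable, like most of Unreal's screen-space effects, a hedged sketch of enabling it from game code follows; the variable name r.SSGI.Enable should be treated as an assumption rather than documented API.

```cpp
// EnableSSGI.cpp -- a hedged sketch of turning screen space global illumination on
// from C++ via a console variable; the cvar name "r.SSGI.Enable" is an assumption.
#include "HAL/IConsoleManager.h"

void EnableScreenSpaceGI()
{
    // Look up the (assumed) SSGI console variable and switch it on at runtime.
    if (IConsoleVariable* SSGIVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.SSGI.Enable")))
    {
        // ECVF_SetByGameSetting marks the change as coming from a game-side setting.
        SSGIVar->Set(1, ECVF_SetByGameSetting);
    }
}
```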
New support for reading USD (Universal Scene Description) files and writing back changes has been added to help users collaborate more effectively with team members and work in parallel to pull together real-time and traditional asset creation pipelines within a project. For example, modellers can refine assets in a 3D package while scenes are being laid out in the Unreal Editor. The updates will immediately be reflected in the scenes.
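Unreal's own USD import is driven from the editor, so no engine-side code is shown here. Purely to illustrate the kind of scene description being exchanged, the sketch below uses Pixar's open-source USD C++ library (not an Unreal API) to open a stage from a placeholder file path and list its prims, which is the structure that gets refined in a 3D package and re-read in the Unreal Editor.

```cpp
// ListUsdPrims.cpp -- an illustrative sketch using Pixar's USD C++ library (pxr),
// not Unreal's importer, to show the scene description being exchanged.
#include <iostream>
#include "pxr/usd/usd/stage.h"
#include "pxr/usd/usd/prim.h"

int main()
{
    // Placeholder path: any .usd/.usda/.usdc file exported from a DCC package.
    pxr::UsdStageRefPtr Stage = pxr::UsdStage::Open("scene.usda");
    if (!Stage)
    {
        std::cerr << "Could not open stage" << std::endl;
        return 1;
    }

    // Walk every prim on the stage and print its path and schema type,
    // e.g. /Root/Chair (Xform) or /Root/Chair/Mesh (Mesh).
    for (const pxr::UsdPrim& Prim : Stage->Traverse())
    {
        std::cout << Prim.GetPath().GetString()
                  << " (" << Prim.GetTypeName().GetString() << ")" << std::endl;
    }
    return 0;
}
```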
nDisplay has been redesigned for consistency with standard Unreal Engine workflows, making it natively compatible with existing projects and allowing replicated inputs and synced visuals to be delivered across networked PCs.
nDisplay supports visualisation systems that render real-time content simultaneously across multiple displays. These may be adjacent physical screens forming a Powerwall, or projectors casting the 3D environment onto surfaces such as domes, walls and screens in a virtual environment. It is no longer necessary to use custom Pawns and Game Modes.
When creating a new project in Unreal Engine, users are now guided through a wizard to choose an industry category, and then select from a series of relevant templates. The required plugins are automatically loaded according to the selected template, and settings are properly configured. This tool makes it easier for new and existing users to get projects started with the settings most appropriate for their industry or task. Alternatively, users can create their own templates and initial settings to share with teams.
With the 4.24 release, Epic Games is also releasing ‘Apollo 11 Mission AR’ (above) as a project sample on the Unreal Engine Marketplace. Built for the Microsoft HoloLens 2, the project demonstrates various aspects of the historic mission in detail, including the launch itself, the Saturn V rocket, the lunar landing and Neil Armstrong’s first steps on the moon, recreated from real-world data and footage from the mission.
By releasing this project as a sample, any user can see how the interactive experience was built from the ground up using the current Engine functions, and gain insight into how the software can be used to create similar projects. www.unrealengine.com