In its most recent releases, Adobe Substance 3D integrates two new Firefly-supported features directly into 3D design and creative workflows.

[Image: Adobe Substance 3D Firefly]

This is the first time Adobe’s collection of Firefly generative AI models has been integrated into the Substance 3D applications. For industrial designers, game developers and VFX artists, the integration is intended to accelerate iterative and creative processes across tasks such as 3D texturing and background image generation.

The Text to Texture tool in Substance 3D Sampler generates photorealistic or stylised textures for the surfaces of 3D objects from simple text prompts. The feature is intended to help artists and designers during ideation and iteration, without relying on physical prototypes, stock imagery or manual photography to get started.

[Image: Adobe Substance 3D Firefly fabric]

Images generated by the process are square and tileable, with the correct perspective to work within the material generation pipeline. Sampler's machine-learning-based Image to Material function then analyses the generated image and creates the normal, height and roughness maps needed to use the texture as a parametric material, customisable through built-in parameters for either highly realistic or stylised looks.
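Adobe has not detailed the internals of Image to Material, but the classical, non-ML baseline for this step helps explain what those maps are: treat a grayscale version of the tileable image as a rough height map and derive a tangent-space normal map from its gradients. The Python sketch below illustrates that idea on a synthetic tileable pattern; using luminance as a height proxy and the strength parameter are assumptions for illustration, not Adobe's method.

```python
import numpy as np

def height_to_normal(height, strength=2.0):
    """Derive a tangent-space normal map from a height map.

    Uses wrap-around finite differences so a tileable height map
    yields a tileable normal map. `strength` scales bump intensity
    (an illustrative parameter, not a Sampler setting).
    """
    # Central differences with wrap-around (np.roll) to preserve tileability.
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5

    # Surface normal proportional to (-dh/dx, -dh/dy, 1), then normalised.
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(height)
    length = np.sqrt(nx**2 + ny**2 + nz**2)
    normal = np.stack([nx, ny, nz], axis=-1) / length[..., None]

    # Pack from [-1, 1] into [0, 1] for storage as an RGB image.
    return normal * 0.5 + 0.5

if __name__ == "__main__":
    # Synthetic tileable "height map": stands in for the grayscale
    # luminance of a generated texture image.
    size = 256
    y, x = np.mgrid[0:size, 0:size] * (2 * np.pi / size)
    height = 0.5 + 0.25 * np.sin(3 * x) * np.cos(5 * y)

    normal_map = height_to_normal(height)
    print(normal_map.shape, normal_map.min(), normal_map.max())
```

Because the differences wrap around the image edges, a tileable input yields a tileable normal map, mirroring the seamless output the material pipeline expects.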

Similarly, users can create distinctive, detailed background images from text prompts using Substance 3D Stager's new Generative Background feature. When one of these background images is selected, Stager's Match Image function, also based on machine learning, accurately composites 3D models into the generated environment, automatically matching perspective and lighting for harmonious, natural-looking results.
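Adobe has not published how Match Image works either, but a toy version of one ingredient, matching the foreground's overall lighting to the background, can be sketched: estimate a global light colour from the brightest region of the background and tint the rendered object with it before alpha compositing. Everything below (the synthetic images, the bright-pixel light estimate and the tint_strength value) is an illustrative assumption, not Stager's algorithm, and perspective matching is ignored entirely.

```python
import numpy as np

def estimate_light_colour(background, top_fraction=0.05):
    """Estimate a global light colour as the mean of the brightest pixels."""
    luminance = background @ np.array([0.2126, 0.7152, 0.0722])
    threshold = np.quantile(luminance, 1.0 - top_fraction)
    bright = background[luminance >= threshold]
    return bright.mean(axis=0)

def composite(background, foreground_rgb, foreground_alpha, tint_strength=0.5):
    """Tint the foreground towards the estimated light colour, then alpha-blend."""
    light = estimate_light_colour(background)
    tinted = foreground_rgb * ((1 - tint_strength) + tint_strength * light)
    tinted = np.clip(tinted, 0.0, 1.0)
    a = foreground_alpha[..., None]
    return tinted * a + background * (1 - a)

if __name__ == "__main__":
    h, w = 180, 320
    rng = np.random.default_rng(0)

    # Synthetic warm-lit background and a grey disc as a stand-in foreground.
    background = np.clip(rng.normal([0.8, 0.6, 0.4], 0.05, (h, w, 3)), 0, 1)
    yy, xx = np.mgrid[0:h, 0:w]
    mask = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) < 40 ** 2
    foreground_rgb = np.full((h, w, 3), 0.5)
    foreground_alpha = mask.astype(float)

    result = composite(background, foreground_rgb, foreground_alpha)
    print(result.shape, result.min(), result.max())
```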
 
According to Adobe, these updates support faster ideation, broader creative exploration and the ability to generate high-quality, realistic textures and environments in less time across industrial design, game development, brand presentation and visual effects.

[Image: Adobe Substance 3D Firefly textures]
 
By default, Adobe Firefly attaches Content Credentials to assets created or edited with Firefly, indicating that generative AI was used in the creative process. Content Credentials are verifiable details that serve as a digital origin label. They can show contextual information including an asset’s name, creation date, the tools that were used, and any edits made. Based on free, open-source tools from the Coalition for Content Provenance and Authenticity (C2PA), this data remains associated with the content wherever it is used, published or stored, enabling attribution and informing decisions about digital content.
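Readers who want to inspect these credentials themselves can use the C2PA ecosystem's open-source c2patool command-line utility, which prints an asset's manifest store as JSON. The minimal Python sketch below simply shells out to it; it assumes c2patool is installed and on the PATH, and the file name textured_render.png is a hypothetical example.

```python
import json
import subprocess

def read_content_credentials(asset_path):
    """Return the C2PA manifest store of an asset as a Python dict.

    Relies on the open-source `c2patool` CLI, which prints the manifest
    store as JSON when given a file path. Returns None if the tool
    fails, e.g. when the asset carries no Content Credentials.
    """
    result = subprocess.run(
        ["c2patool", asset_path],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(result.stderr.strip())
        return None
    return json.loads(result.stdout)

if __name__ == "__main__":
    # Hypothetical path to an export that was generated with Firefly.
    manifests = read_content_credentials("textured_render.png")
    if manifests:
        print(json.dumps(manifests, indent=2))
```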
 
Beta versions of Substance 3D Sampler 4.4 and Stager 3.0, which include the new Text to Texture and Generative Background tools, were shown during Substance Days at GDC 2024 in San Francisco in late March, and are now available to Substance 3D customers. www.adobe.com