MEPTIK and disguise Capture a World Beyond in Real-time for Fleurie
MEPTIK, an experience design agency in Atlanta, is best known for creating dynamic environments for live events. But on a recent music video project, the team took the opportunity to use its expertise in Extended Reality (XR) environments to make a striking change to its on-set video production workflow.
In the final video, called ‘A World Beyond’, the singer/songwriter and musician Fleurie performs in an XR experience that surrounds her with the starry heavens, underwater elements and city streets, among other dynamic environments. However, unlike in a traditional greenscreen production, the singer was surrounded by the moving imagery itself, generated and displayed on the set in real time.
disguise xR Mixed Reality Workflow
During the shoot, MEPTIK created the content in Notch, software used to produce live, interactive video effects and motion graphics. One of the primary applications of Notch’s visual effects engine is creating effects around tracked performers on stage by integrating with media servers, in this case disguise gx2 servers running the disguise xR Mixed Reality (MR) workflow. Artists create in Notch Builder, then export the content for playback inside the media server.
xR combines camera tracking and real-time rendering to create an immersive virtual environment, visible live on set and shot directly in camera. Using high-resolution LED screens, MR allows actors to be immersed in a virtual environment, while camera tracking enables the real-time content on these screens to be rendered from the point of view of the camera.
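The geometric idea behind rendering from the camera's point of view can be sketched in a few lines. This is a hypothetical illustration, not disguise's implementation: it assumes a flat LED wall on the plane z = 0, a tracked camera in front of it, and virtual scenery behind it, and computes where on the wall a virtual point must be drawn so it lines up in the camera's view.

```python
# Hedged sketch of camera-POV projection for an LED wall (assumed geometry,
# not the disguise xR renderer): each virtual point is drawn where the
# camera's sight line to that point crosses the physical screen plane.

def project_to_screen(camera, point):
    """Intersect the ray camera -> point with the screen plane z = 0.

    camera, point: (x, y, z) tuples. The LED wall sits at z = 0, the
    camera at z > 0, and the virtual scenery at z < 0. Returns the (x, y)
    spot on the wall where the point must be displayed.
    """
    cx, cy, cz = camera
    px, py, pz = point
    t = cz / (cz - pz)  # ray parameter where z reaches 0
    return (cx + t * (px - cx), cy + t * (py - cy))

# A camera directly in front of the wall sees a centred virtual point
# drawn at the wall's centre:
print(project_to_screen((0.0, 0.0, 2.0), (0.0, 0.0, -3.0)))  # (0.0, 0.0)
```

Because the result depends on the camera position, the drawn position shifts as the camera moves, which is why the screens must be re-rendered from live tracking data rather than playing fixed video.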
To keep the server constantly informed of Fleurie’s precise position, MEPTIK also tracked her in 3D space with a BlackTrax vision-based motion tracking system, which connects to third-party applications such as the disguise media servers. The system’s tiny LED beacons tag performers within the calibrated interactive display area and pulse at a specific rate with their own unique IDs. Users associate the beacons with the performers in BlackTrax’s UI, and sensor lenses positioned around the tracking space capture the motion of the beacons and feed the real-time positional data to the media servers.
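The data flow just described, beacons with unique IDs streaming positions to a server that keeps each performer's latest location, can be sketched as follows. The plain-text "id,x,y,z" packet format here is an assumption for illustration only; BlackTrax's actual wire protocol is binary and differs from this.

```python
# Hedged sketch of a beacon-tracking data feed (simplified packet format,
# not BlackTrax's real protocol): parse incoming position packets and keep
# the latest 3D position per beacon ID, as a media server might.

def parse_beacon_packet(packet: bytes):
    """Decode a simplified 'id,x,y,z' packet into (beacon_id, position)."""
    beacon_id, x, y, z = packet.decode("ascii").split(",")
    return int(beacon_id), (float(x), float(y), float(z))

def update_positions(positions: dict, packet: bytes) -> dict:
    """Store the latest position for the beacon named in the packet."""
    beacon_id, pos = parse_beacon_packet(packet)
    positions[beacon_id] = pos
    return positions

positions = {}
update_positions(positions, b"7,1.25,0.0,3.50")
print(positions)  # {7: (1.25, 0.0, 3.5)}
```

In the real workflow the server consumes this stream continuously, so the rendered environment can react to the performer's position frame by frame.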
Fleurie on Camera
Displayed on the LED screens driven by the disguise gx2 servers and fed by the BlackTrax motion data, the environments were recorded in camera on set, but were also interactive. The set-up allowed changes to be made in real time, and let the talent interact with the visuals on the screens.
The disguise gx2 servers manage virtual environments.
Throughout production, the MEPTIK creative team worked with director Elle Ginter to produce imagery that would capture the musical moods she wanted. Both Elle and Fleurie were able to see exactly what the finished product would look like without having to go into post-production.
Nick Rivero, co-founder and CTO of MEPTIK, believes this kind of visual, interactive workflow will modernise the traditional filmmaker’s creative process by removing, or at least shortening, protracted and expensive post-production cycles. “Instead, it allows us to make real-time editing decisions on the set, and save our clients time and money by only having to capture once. Meanwhile, we are creating a new experience for the performers by replacing an empty green stage with elements they can see and believe,” Nick said.
Nick noted that the disguise xR workflow and toolkit’s particular value lies in the level of control it gives directors and designers within immersive production experiences, which he views as an extremely meaningful step forward for larger film, television and live event productions in the coming years. www.meptik.com