
SWR’s project explores the potential of virtual studio production for TV programming, based on virtual worlds displayed on a Sony VERONA LED wall plus three cameras tracked with OCELLUS.


German broadcaster SWR has launched a Virtual Production innovation project with a three-month trial run to explore the potential of virtual studio production for their programming. Virtual worlds are displayed on a large Sony VERONA LED wall in their Studio 6 at Baden-Baden – directly in front of the cameras as they record. Presenters, hosts and actors perform on sets built between the cameras and these digital backdrops.

Because the background artwork is created in 3D software, the images can be moved and re-aligned on the LED wall in real time to stay synchronised with the camera’s position as it changes. Throughout the shoot, the background maintains a realistic appearance live in the studio, with no need to wait for post-production.

Where Gaming, Streaming and Virtual Production Meet

The three-month test focuses on Fehler im System (Errors in the System), a reality TV show built around an interactive pen-and-paper game and created in collaboration with the production company Midflight Productions. The project considers how the interplay of gaming, streaming and virtual production for TV can produce a result that engages and interests viewers.

The pen-and-paper format refers to a kind of narrative role-playing game. A game master guides players through the story, describing locations, events and supporting characters, while the players take on characters they have invented themselves and decide how they act. They are supported by a set of rules and rolls of the dice that determine success or failure.

On 24 and 25 October 2025, the studio will be transformed into a dystopian world for four hours each day. Hauke Gerdes, a pen-and-paper games expert known from the German web channel rocketbeans.tv, which is dedicated to gaming and pop culture, will be the game master. Streamers, creators and well-known German actors will set out on a search for clues as part of the game.

Driving the Narrative

The evenings will be broadcast live on twitch.tv/ard on 24 and 25 October, starting at 7pm each night. 

Anja Räßler, Creative Director at Midflight Productions, said, "We are very keen to make the pen-and-paper genre accessible to a wider audience. Its attraction lies in the combination of epic stories, collaborative play and spontaneous creativity. In SWR, we have now found the ideal partner to realise our vision of combining classic role-playing with the new virtual production techniques. The synergy between the tools, the team and the team members is enormous – it's great fun to work together on this new form of production."

The pen-and-paper format is not only innovative as a way to drive the narrative – it also calls for a technical step forward. This project will be among the first multi-camera productions to implement three tracked cameras live in a virtual set-up. To display the virtual backgrounds, SWR has installed a 10 x 4 metre Crystal LED VERONA wall with the new OCELLUS tracking system from Sony, with whom SWR has entered into a development partnership for this project. OCELLUS was developed specifically for augmented reality and virtual production applications in the broadcast and cinema sectors.


Tracking Camera Motion

To track camera motion, several types of data have to be accurately recorded and then combined effectively in order to support virtual production, on-set previewing and post workflows. OCELLUS automates this work. It consists of a multi-eye Image Sensor Unit, a Lens Encoder and a Processing Box that together integrate the tracking data with camera and lens metadata.

Because OCELLUS is a markerless system, the Midflight team doesn’t need to set up IR markers or stationary tracking cameras. Instead, the Sensor Unit uses Visual SLAM (Simultaneous Localization and Mapping) to capture the real location’s data, indoors or out, and is able to recognise feature points on set. Developed to work with LED displays and green screens, this kind of marker-free tracking also avoids marker reflections on the display and limits correction work in post.
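The core idea behind feature-point tracking can be illustrated with a deliberately simplified sketch. This is not Sony's algorithm: real Visual SLAM estimates a full six-degree-of-freedom camera pose against a 3D map of landmarks, whereas the toy below assumes a static scene and a pure 2D camera translation, so the camera's shift is just the negated average displacement of the matched feature points.

```python
# Toy illustration (not Sony's algorithm): estimating a camera's lateral
# shift from matched 2D feature points between two frames. A real Visual
# SLAM system solves a full 6-DoF pose against 3D landmarks; here we
# assume a static scene and pure 2D translation.

def estimate_camera_shift(points_prev, points_curr):
    """Estimate a (dx, dy) camera translation from matched feature points."""
    n = len(points_prev)
    dx = sum(c[0] - p[0] for p, c in zip(points_prev, points_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(points_prev, points_curr)) / n
    # Feature points appear to move opposite to the camera's own motion.
    return (-dx, -dy)

# Example: every feature shifted left by 2 units, so the camera moved right.
prev = [(10.0, 5.0), (20.0, 8.0), (30.0, 12.0)]
curr = [(8.0, 5.0), (18.0, 8.0), (28.0, 12.0)]
shift = estimate_camera_shift(prev, curr)
```

The more feature points contribute to the average, the less any single mismatched point distorts the estimate, which is the intuition behind the multi-eye unit's stability claim below.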

While virtual production reduces the need for post-production in some ways, a certain level of post is always necessary. OCELLUS helps ease the process through the accuracy of the tracking information and lens data it records during shooting to support editing, match-moving and other visual effects. It also allows immediate previews of composited footage, on set or outdoors. That means adjustments can be made in near-real time, while tracking data can be recorded as FBX files onto an SDXC memory card for later use.

The small, lightweight Sensor Unit can be mounted in any position or orientation, as it has eyes on five of its faces. It can capture a large number of feature points at once, which improves the stability of the performance. If at least one image sensor in use captures valid feature points, tracking data can be extracted.

Combined Data Processing

When using Sony cameras that automatically retrieve lens information, users can acquire metadata and sync information via a single SDI cable to the Processing Box. The status of camera tracking, lens data and other information can be continuously checked on an OLED display.

Alternatively, should the camera’s SDI output be unavailable, metadata can also be taken directly from the lens itself using the system's Lens Encoder. A rotary mechanism detects the rotation angles of the zoom, focus and iris rings, which the Lens Encoder passes to the Processing Box. The box is equipped with Genlock input, Timecode input, SDI input/output connectors, and the connector for the Lens Encoder.
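The general principle of such a rotary encoder is a calibrated mapping from ring rotation angle to a lens value. The sketch below is hypothetical: the calibration table and function names are invented for illustration, and a real system is calibrated per lens, but it shows the interpolation step that turns an encoder angle into a normalised focus value.

```python
# Hypothetical sketch of the lens-encoder principle: map a ring's rotation
# angle to a normalised lens value via a per-lens calibration table.
# The table values below are invented for illustration.

from bisect import bisect_right

# (ring angle in degrees, normalised focus value), sorted by angle.
FOCUS_TABLE = [(0.0, 0.0), (45.0, 0.25), (120.0, 0.6), (200.0, 1.0)]

def ring_angle_to_value(angle, table):
    """Linearly interpolate a normalised lens value from a ring angle."""
    angles = [a for a, _ in table]
    if angle <= angles[0]:
        return table[0][1]
    if angle >= angles[-1]:
        return table[-1][1]
    i = bisect_right(angles, angle)          # first table entry above angle
    a0, v0 = table[i - 1]
    a1, v1 = table[i]
    t = (angle - a0) / (a1 - a0)             # position between the two entries
    return v0 + t * (v1 - v0)
```

Lens rings are rarely linear, which is why a multi-point table with interpolation is used rather than a single scale factor.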

Regarding the background imagery for the screens, graphics-based video compositing can be accomplished in real time. The Processing Box sends tracking, camera and lens metadata in the free-d format over an Ethernet cable to CG rendering software such as Unreal Engine, which the Midflight team used.
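free-d is a simple, widely supported camera-tracking protocol; its "Type D1" message packs pan, tilt, roll, position, zoom and focus into a fixed 29-byte packet. The decoder below is a sketch based on the commonly documented D1 layout (signed 24-bit fixed-point fields, a one-byte checksum); the exact conventions should be verified against the equipment's own documentation.

```python
# Sketch of decoding a free-d "Type D1" camera tracking packet, the message
# format commonly used to carry pose and lens data to render engines.
# Field layout follows the widely documented 29-byte D1 message; verify
# against your equipment's manual before relying on it.

def _s24(b):
    """Signed 24-bit big-endian integer from a 3-byte slice."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def _u24(b):
    """Unsigned 24-bit big-endian integer from a 3-byte slice."""
    return (b[0] << 16) | (b[1] << 8) | b[2]

def parse_freed_d1(pkt):
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a free-d D1 packet")
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256.
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg":  _s24(pkt[2:5])  / 32768.0,  # degrees, 15 fractional bits
        "tilt_deg": _s24(pkt[5:8])  / 32768.0,
        "roll_deg": _s24(pkt[8:11]) / 32768.0,
        "x_mm": _s24(pkt[11:14]) / 64.0,        # millimetres, 6 fractional bits
        "y_mm": _s24(pkt[14:17]) / 64.0,
        "z_mm": _s24(pkt[17:20]) / 64.0,
        "zoom":  _u24(pkt[20:23]),              # raw encoder counts
        "focus": _u24(pkt[23:26]),
    }
```

One such packet per camera per frame is enough for the render engine to re-project the virtual background, which is why a plain Ethernet link suffices even for a three-camera setup.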

The planning, construction, setup and calibration of the production technology was carried out in collaboration with the Austrian AV company AV-Professional.

The Real and the Unreal

Sebastian Leske, Head of Cinema Business Development at Sony Europe remarked, "Studio productions face the challenge of invisibly blending actors, presenters and a studio set with a virtually generated background. This works best when the individual components, from LED walls to camera technology and sensors, are coordinated and work together as a well-rehearsed team. When a comprehensive software solution brings these components together in an efficient workflow, creativity gains much more space in the television studio."  


The six virtual sets used during the pen-and-paper exploration were created in advance using the Unreal Engine real-time graphics software. Though it originated in computer game development, Unreal Engine is increasingly finding its way into television studios. The aim is to integrate real objects in front of the LED wall in such a way that they merge with the virtual 3D sets.

The transitions between physical equipment and digital worlds should appear so realistic that the boundaries are barely recognisable. When both the camera tracking and the artists’ 3D work are accurate enough, the parallax effect can take over, allowing objects to move relative to each other to create dynamic spatial depth and a highly realistic 3D effect.
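The geometry behind that parallax is straightforward. The sketch below is an illustrative model (my own simplification, not SWR's rendering pipeline): for a camera at lateral offset s, a wall plane at distance w, and a virtual point at depth d behind the camera, the point must be drawn where the camera-to-point ray crosses the wall. A camera move of s then shifts the drawn point by s·(1 − w/d), so points at the wall plane stay put while distant points shift almost as far as the camera, and that differential shift is what creates the depth illusion.

```python
# Illustrative geometry (a simplification, not SWR's pipeline): where a
# virtual point must be drawn on the LED wall so it appears in the correct
# direction from a tracked camera. The camera sits at lateral offset s, the
# wall plane at distance w, and the virtual point at lateral position x
# and depth d (with d >= w).

def wall_position(s, w, x, d):
    """Intersection of the camera->point ray with the wall plane z = w."""
    # Ray from camera (s, 0) to point (x, d) hits z = w at:
    return s + (x - s) * w / d

# Moving the camera by s shifts the drawn point by s * (1 - w/d):
# a point at the wall plane (d == w) does not move at all, while a
# far point (d >> w) shifts nearly as far as the camera itself.
```

This is why both accurate tracking and accurate scene depth matter: an error in either s or d puts the drawn point at the wrong wall position and breaks the illusion.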

Testing for In-house Productions at SWR

With this project, SWR is evaluating the role that virtual production can play in future in-house productions – whether for entertainment, culture or information. The aim is to make public broadcasting fit for the future through technological innovation and to sustainably reach new target groups, especially on digital platforms.

As well as the pen-and-paper Fehler im System show, four other use cases from the genres of scenic documentation/trailers, live-on-tape studio production, challenge formats and social media explanatory formats will be tested and evaluated during the project period.  

"Virtual production with multiple tracked cameras can mark a significant change for the media world in many ways," said Michael Eberhard, SWR Director of Technology and Production. "While it gives us creative possibilities and makes us more efficient, flexible and economical, it can also open doors for collaboration in public broadcasting, making it a perfect fit for our times. Thanks to my colleagues and the technology partners involved, we have succeeded in installing a test setup at our Baden-Baden site at short notice, which we will be testing extensively until the end of the year." pro.sony