Ncam Develops Depth Keying for Broadcast Augmented Reality
Ncam has developed a new approach to real-time depth keying for live augmented reality productions that allows presenters to move freely around, and interact with, virtual elements. The company is demonstrating its new system at IBC2016.
Ncam’s camera tracking ensures that the virtual graphical elements added to live-action studio or location shoots remain in place, however the camera moves. Most existing systems struggle when live elements such as presenters pass in front of or behind a graphic, which makes movement around the live action difficult and awkward.
Ncam overcame the problem by developing its depth keying capabilities far enough to determine precisely how far the presenter is from the camera. At NAB 2016 this was demonstrated using the Unreal game engine. Now, RT Software has implemented a means of interacting with this depth information in its own broadcast graphics engine, via integration with Ncam’s SDK. Ncam and RT Software have already collaborated on some very effective AR applications. The installation at BT Sport in the UK, for example, is shortlisted for an IBC Innovation Award for its advances in the use of augmented reality to engage audiences.
One of BT Sport’s uses of augmented reality is to place pop-ups of players in position on a football pitch, with the cameras moving around the team line-ups. The new depth keying system will allow presenters to walk between and around the pop-up virtual players as they discuss their capabilities, creating a direct visualisation of game analysis. Solid image placement is still supported, even with unpredictable movement of the camera and multiple cameras.
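The core idea behind this kind of depth keying can be sketched as a per-pixel compositing decision: the virtual element is drawn over the live image only where it sits closer to the camera than the live-action subject. The sketch below is a generic illustration of depth-based compositing, not Ncam's or RT Software's actual implementation; the function and parameter names are hypothetical.

```python
import numpy as np

def composite_with_depth(live_rgb, live_depth, cg_rgb, cg_alpha, cg_depth):
    """Per-pixel depth compositing (illustrative sketch).

    live_rgb   : (H, W, 3) live camera image
    live_depth : (H, W)    estimated distance of the live scene per pixel
    cg_rgb     : (H, W, 3) rendered virtual graphic
    cg_alpha   : (H, W)    coverage of the virtual graphic (0..1)
    cg_depth   : (H, W)    distance of the virtual graphic per pixel
    """
    # True where the virtual element is nearer the camera than the presenter,
    # i.e. where the graphic should occlude the live action.
    cg_in_front = cg_depth < live_depth
    # The graphic's alpha only applies where it is in front; behind the
    # presenter it contributes nothing, so the presenter occludes it.
    alpha = np.where(cg_in_front, cg_alpha, 0.0)[..., None]
    return alpha * cg_rgb + (1.0 - alpha) * live_rgb
```

With a per-pixel depth estimate of the presenter, the same virtual player pop-up is occluded when the presenter walks in front of it and occludes the presenter when they walk behind it, which is what allows free movement through the scene.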
Nic Hatch, CEO of Ncam, noted that although people who saw their demonstration at NAB were impressed by it, many remained wary because it depended on a technical platform – a game engine – that they weren’t familiar with. Therefore, RT Software’s addition of Ncam’s depth capabilities to its standard software, which many broadcasters already have, opens an exciting opportunity.
Ventuz Technology Develops Integration with Ncam
Ventuz Technology, a developer of 3D real-time systems for presentation, event and broadcast graphics, has also integrated with the Ncam Live real-time multi-sensor camera tracking system. The integration establishes a tracking data workflow between the Ncam system and real-time 3D Ventuz content, which results in a precise communication between the graphics and actual camera movements for virtual or augmented projects.
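In broad terms, such a tracking workflow delivers a per-frame camera pose and lens state to the graphics engine, which uses it to render virtual elements from exactly the real camera's viewpoint. The sketch below shows the underlying principle with a simple pinhole projection; the data structure and field names are hypothetical and do not reflect Ncam's actual SDK schema.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackingFrame:
    """Hypothetical per-frame tracking packet (illustrative only)."""
    position: np.ndarray   # camera position in world space, shape (3,)
    rotation: np.ndarray   # world-to-camera rotation matrix, shape (3, 3)
    focal_px: float        # focal length in pixels
    principal: tuple       # principal point (cx, cy) in pixels

def project(frame: TrackingFrame, world_point: np.ndarray):
    """Project a world-space point into the image using the tracked pose.

    A graphic 'pinned' to a fixed studio position stays visually locked in
    place because each frame is re-projected with the latest camera pose.
    """
    # Transform the point into camera coordinates.
    cam = frame.rotation @ (world_point - frame.position)
    if cam[2] <= 0:
        return None  # point is behind the camera, nothing to draw
    cx, cy = frame.principal
    return (frame.focal_px * cam[0] / cam[2] + cx,
            frame.focal_px * cam[1] / cam[2] + cy)
```

Because the pose arrives every frame, the render camera follows handheld or shoulder-mounted moves as well as pedestal or crane moves, which is what keeps the graphics and the real camera movement in step.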
David Paniego, Product Marketing Manager of Ventuz Technology, said the flexible nature of the Ncam system is what makes it a good match for Ventuz software. It does not require a complicated set-up procedure or markers to be placed in the environment. Instead, it automatically identifies and tracks natural features via a bar with multiple sensors, mounted to the main shooting camera. This simple design adapts well to camera tracking on location or shoulder-held shots, for example, and allows a level of flexibility that has contributed to its adoption for broadcast and film around the world.
David also said, “We also believe that the combination of our products will work well with the Live-Link we recently introduced between Cinema 4D and Ventuz, resulting in an improved material engine, for more realistic results. This integration shows the focus of the current version of our software, Ventuz 5, which supports the implementation of various systems into the real-time graphics environment.”
He believes the combined systems will be applicable not only to the broadcast market but also to the wider professional AV industry. He says Ventuz customers have shown interest in the collaboration, especially in the East Asian market, where the Ventuz software has been used for a number of AR projects, most recently by CCTV for its Sports Awards Ceremony.

www.ncam-tech.com