Glassbox Technologies has upgraded its DragonFly professional virtual camera software to version 2.0, adding improved camera tracking, including support for HTC VIVE’s Vive Tracker, more flexible workflows, new floating license options and live motion capture support. DragonFly 2.0 gives independent content creators and studios more options when they need a flexible virtual camera system adaptable to either remote or in-studio workflows.
“After working with our clients over the past few months, we identified what would make the biggest difference to them in the midst of these challenging times,” said Glassbox CPO and co-founder Mariana Acuña Acosta. “Because content creators are looking for flexibility to work remotely with various hardware and software production systems, we have aimed to make the process easier.”
DragonFly is Glassbox’s professional virtual camera, made for use during preproduction with Unreal Engine, Unity 3D and Maya. Users can view their CG environments, character performances and scenes as if they were looking through a camera at a live-action shoot. Used in previs, virtual cinematography or for virtual location scouting, DragonFly renders shots in real time through the camera’s viewfinder, an LCD monitor or an iPad. Users see into their virtual world, and can then record, bookmark and create snapshots and real, repeatable camera moves.
HTC Vive Tracker Support
DragonFly 2.0’s main change is native support in the Unreal and Maya versions for the Vive Tracker, HTC VIVE’s device for tracking real objects that users want to take with them into a virtual world. Use of the Vive Tracker in virtual production has been increasing as teams adjust to remote work, due to its low latency and compatibility with VPNs. Glassbox partnered with HTC Vive to develop a ready-to-use system for users, which adds Vive Tracker support to DragonFly’s native list of tracking options and gives users a simplified UI that automates the setup.
The Vive Tracker is a small motion tracking accessory that attaches to real objects and works with the HTC Vive VR headset. The tracker creates a wireless connection between the object and the headset so that the player can use the object in the virtual world. Although the most common applications are guns, hands or feet and sports gear for playing games, the tracker can also be attached to a camera – even a DragonFly virtual camera – to create mixed reality videos.
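In a virtual camera context, a tracker like this essentially supplies a live 6-DoF pose, that is, a position plus an orientation, that drives the CG camera’s transform each frame. As a rough illustration only (a hypothetical sketch; the names and structures below are invented and are not DragonFly’s or HTC’s actual API), applying such a pose to a virtual camera might look like this:

```python
from dataclasses import dataclass

@dataclass
class TrackerPose:
    """A 6-DoF pose as a tracker might report it: position in metres,
    orientation as a unit quaternion (w, x, y, z)."""
    position: tuple
    rotation: tuple

def quat_to_forward(q):
    """Rotate the camera's default forward axis (0, 0, -1) by quaternion q,
    using the standard quaternion-to-matrix third column."""
    w, x, y, z = q
    fx = -(2 * (x * z + w * y))
    fy = -(2 * (y * z - w * x))
    fz = -(1 - 2 * (x * x + y * y))
    return (fx, fy, fz)

def apply_pose_to_camera(pose):
    """Return a minimal virtual-camera state: where it is, where it looks."""
    return {"position": pose.position, "forward": quat_to_forward(pose.rotation)}

# Identity rotation: the camera sits at the tracker's position, looking down -Z.
cam = apply_pose_to_camera(TrackerPose((1.0, 1.5, 0.0), (1.0, 0.0, 0.0, 0.0)))
```

In a real pipeline, poses arrive at tracker rate (90 Hz or more), which is what keeps the latency low enough for handheld camera operation.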
Mixed Reality
Experiences that overlay graphics on live-action video are augmented reality, and experiences that replace your surroundings with a digital environment are virtual reality. Experiences that sit between the two are mixed reality, which uses computer processing to combine human input (keyboard, mouse and voice) with environmental input such as location and position, lighting and boundaries.
“Integrating Vive Tracker support into Glassbox’s virtual cinematography software has opened more possibilities for content creators,” said Raymond Pao, SVP of Products and Strategy at HTC VIVE. “By combining DragonFly 2.0 with HTC VIVE’s tracking system, Glassbox has developed a new virtual cinematic experience, resulting in an effective virtual production workflow.”
DragonFly Companion App
Other updates affecting all three DragonFly versions – Unreal, Unity and Maya – include new touch screen controls in the DragonFly Companion App, which users download from the App Store to use on mobile devices. Users can now control the virtual camera directly from the viewer, with or without the GameVice controller accessory. In addition, two new configurable HUD widgets for lens and shot name, including lower-bar customisation, let users tailor settings for iPad or iPhone set-ups.
The Companion App also allows users working with ARKit, Vive Tracker, Optitrack or other specialist tracking systems to use an iPad or iPhone as a viewer.
For the Unreal Engine version, DragonFly 2.0’s custom tracking input API makes it possible for users of specialist tracking systems, such as Mo-Sys and TechnoCrane, to override the native tracking inputs and work with their preferred system. The Blueprints API also adds workflow flexibility by allowing users to set a recording name via an external source, play back their last recording via a button binding, and toggle back and forth between recording and reviewing.
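The record-and-review workflow described above can be pictured as a small state machine. The sketch below is purely illustrative; the class and method names are invented for this article and are not DragonFly’s Blueprints API:

```python
class VirtualCameraRecorder:
    """Illustrative model of naming, recording and reviewing takes."""

    def __init__(self):
        self.mode = "recording"   # or "reviewing"
        self.take_name = None
        self.last_take = None

    def set_recording_name(self, name):
        """Name the next take, e.g. from an external source such as a
        shot database or a companion-app text field."""
        self.take_name = name

    def record_take(self, camera_moves):
        """Store a take so it can be played back later."""
        self.last_take = {"name": self.take_name, "moves": camera_moves}

    def toggle_mode(self):
        """A button binding flips between recording and reviewing."""
        self.mode = "reviewing" if self.mode == "recording" else "recording"

    def play_last_recording(self):
        return self.last_take

rec = VirtualCameraRecorder()
rec.set_recording_name("sc01_take03")
rec.record_take(["dolly in", "pan left"])
rec.toggle_mode()                   # now reviewing
take = rec.play_last_recording()
```

The point of exposing these as API calls rather than UI-only actions is that an external tool, or a single mapped button, can drive the whole loop without the operator leaving the viewfinder.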
Live Mocap Support
For live motion capture support, whether users are working with an optical motion capture system or an at-home volume, DragonFly now has Android Pie operating system support for Live Link data streaming, which means they can visualise animation and actor performances in real time in their virtual camera session.
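Conceptually, a live stream like this pushes a frame of named skeletal data on every tick, and the virtual camera viewport re-poses the character from each frame as it arrives. As a loose illustration (the frame format here is invented and is not the actual Live Link wire protocol):

```python
def mocap_stream(frames):
    """Simulate a live stream of per-frame skeletal data, as a mocap
    source might push it over the network (format invented here)."""
    for frame in frames:
        yield frame

def visualise(frame):
    """Stand-in for updating the viewport: list the joints posed this frame."""
    return sorted(frame["joints"].keys())

# Two frames of a hypothetical two-joint skeleton, 25 fps apart.
frames = [
    {"time": 0.00, "joints": {"hips": (0, 1.0, 0.0), "head": (0, 1.7, 0.0)}},
    {"time": 0.04, "joints": {"hips": (0, 1.0, 0.1), "head": (0, 1.7, 0.1)}},
]
posed = [visualise(f) for f in mocap_stream(frames)]  # one update per frame
```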
DragonFly can be purchased as an annual subscription from the Glassbox website, and perpetual license options are available on request. Glassbox Technologies has also added floating licenses to the license types for all versions of DragonFly: licenses can now be checked in and out by team members working on different machines, at home or in the studio. www.glassboxtech.com