Software developers at Vicon and Rokoko are making it simpler to turn real performances into authentic animations for digital characters. The results are fast and accurate enough for VR experiences and games. In July 2020, Vicon released v1.3 of its intelligent VR software platform Evoke to support full-body VR experiences with higher accuracy, more reliable tracking and the ability to auto-assign characters to the markers participants wear. Just recently in September, Rokoko released a new inertial system for accurately tracking hands and fingers called Smartgloves.
Evoke software is part of Vicon’s complete virtual reality capture system, Origin for VR, in which each participant is identified and tracked with a customisable system of LED tracking clusters. Origin hardware consists of Viper and Viper X cameras specialised for accuracy and continuous tracking, the Pulsar tracking clusters of LED markers that participants wear during the experience, and an RF networking device called Beacon that locates and wirelessly links the cameras and clusters into a network. This hardware network creates Origin.
Origin’s Evoke software forms a bridge between Vicon’s camera-cluster network and the customer’s VR experience, which makes Origin an autonomous, start-to-finish motion capture package for virtual reality experiences. Evoke integrates directly with Unreal Engine and Unity so that third-party developers can connect the Origin data with their own applications, including characters and environments.
Important updates in Evoke’s version 1.3 this past July include real-time system auto-healing, which recalibrates cameras that fall out of alignment so that sessions continue uninterrupted without operator intervention. Continuity is critical for VR because the whole experience needs to unfold in real time to remain immersive.
Also, assigning characters to participants at venues is now faster and partly automated. Origin uses a 6-cluster skeleton, attaching small clusters of eight LEDs on the head, hands, back and feet. Operators now only need to identify a single reference object, such as a headset, for each participant. From there, Evoke will automatically assign all six of the participant’s marker clusters to the character’s limbs, shortening the set-up and calibration time by a large margin.
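The auto-assign step described above can be pictured with a short sketch. This is an invented heuristic, not Evoke's actual algorithm: given the six cluster positions for one participant, it guesses limb roles from relative height and left/right position.

```python
# Hypothetical sketch of auto-assigning six marker clusters to limb roles.
# The heuristic (sort by height, split left/right by x) is purely illustrative
# and is not Vicon's actual assignment logic.

def auto_assign(clusters):
    """clusters: {cluster_name: (x, y_up, z)} for one participant's six clusters.
    Returns a mapping of limb role -> cluster name."""
    by_height = sorted(clusters, key=lambda n: clusters[n][1], reverse=True)
    head, back = by_height[0], by_height[1]          # two highest clusters
    hands = sorted(by_height[2:4], key=lambda n: clusters[n][0])  # left = smaller x
    feet = sorted(by_height[4:6], key=lambda n: clusters[n][0])
    return {"head": head, "back": back,
            "left_hand": hands[0], "right_hand": hands[1],
            "left_foot": feet[0], "right_foot": feet[1]}
```

In a real system the assignment would of course use the identified reference object and the clusters' tracked patterns rather than raw positions alone.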
Speed and ease of use at venues are important factors for operators, who need to consider the return on their investment in a VR experience. The quicker and simpler the set-up process is, the fewer staff they will need and the more participants they can handle.
Characters from Clusters
Digital Media World asked Tim Massey, product manager at Vicon, for more detail about how their hardware and software systems work together, and what the recent updates mean for users. “The auto-assign and calibration functionality is quite similar to Vicon’s camera calibration process, which uses a T-shaped device that holds a fixed pattern of markers – in this case, infrared LEDs,” Tim said. “This device serves as a model that our system stores and recognises from the marker clusters’ positions relative to each other.
“The calibration process works by moving this device in front of multiple cameras at once. Within a few minutes, the recorded observations can then be used to automatically work out the markers’ positions relative to each other, tracking them at any time during the experience in Evoke.
“Similar to the T-shaped calibration device, our Pulsar clusters are a fixed array of eight asymmetrically placed LEDs that have a series of known patterns. These can be manipulated by turning different LEDs on and off via radio communication to create unique patterns that we can then visually track to identify each character and its position in 3D space, at any time. It allows us to track the clusters accurately, in real time with all markers visible in every frame – all while still being able to distinguish them from each other continuously.”
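The idea of identifying a cluster from a unique on/off LED pattern can be sketched as a simple bit-pattern lookup. The patterns and labels below are invented for illustration and have nothing to do with Vicon's actual radio protocol or encoding:

```python
# Hypothetical sketch: identifying a Pulsar-style cluster from which of its
# eight LEDs are lit. Each known pattern is stored as an 8-bit mask.
# All patterns and labels here are invented, not Vicon's real scheme.

KNOWN_PATTERNS = {
    0b10110001: "head",
    0b01101100: "left_hand",
    0b11010010: "right_hand",
}

def identify_cluster(visible_leds):
    """Map a tuple of eight per-LED on/off states to a cluster label, if known."""
    bits = 0
    for i, on in enumerate(visible_leds):
        if on:
            bits |= 1 << i
    return KNOWN_PATTERNS.get(bits)

cluster = identify_cluster((1, 0, 0, 0, 1, 1, 0, 1))  # matches the "head" mask
```

In practice the system would also use the LEDs' known geometric arrangement, not just which are lit, but the lookup captures the basic principle of pattern-based identification.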
Large Group Tracking and Latency
Further to calibration, proximity grouping of clusters allows cluster patterns to be associated solely with the reference object the user identifies during calibration, while Evoke’s pattern-matching algorithms keep the associations consistent from frame to frame. This means that, although the number of unique patterns is limited, patterns can be repeated, ultimately allowing large numbers of clusters to be tracked without error or interruption at VR-suitable latency.
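Proximity grouping can be illustrated with a minimal sketch: when the same pattern appears on more than one participant, each detected cluster is associated with the nearest reference object. The names and coordinates are invented; this is not Evoke's actual grouping code.

```python
# Hypothetical sketch of proximity grouping: a cluster whose pattern is
# ambiguous is associated with the nearest reference object identified at
# calibration (e.g. a headset). Illustrative names and positions only.
import math

def nearest_reference(cluster_pos, references):
    """Return the name of the reference object closest to a cluster position."""
    return min(references, key=lambda name: math.dist(cluster_pos, references[name]))

references = {
    "player_1_headset": (0.0, 1.7, 0.0),
    "player_2_headset": (4.0, 1.7, 2.0),
}
owner = nearest_reference((0.3, 1.0, 0.1), references)  # a cluster near player 1
```

Frame-to-frame pattern matching would then keep this association stable, so the limited pool of unique patterns can be safely reused across participants.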
Tim said, “Vicon Origin was created to help location-based entertainment developers build free-roaming VR scenarios for multiple players at once. It’s already been possible for a long time to track several objects at very low latency, and Vicon Tracker has done this successfully down to levels of 3.5ms. Today’s challenge, however, is developing systems that support even larger groups.
“For example, Vicon recently partnered with Europa Park, MackNeXT and VR Coaster to create a location-based virtual reality entertainment platform in Rust, Germany with near-infinite free roaming environments and full-body tracking of 32 simultaneous players. A 32-person VR system means tracking upwards of 200 objects (or characters’ limbs, marked by clusters) at latencies acceptable for real-time VR. The ever-improving processing times CPUs can now achieve are certainly helping, but we’ve also had to optimise how we process this data specifically to cope with the demands of such a system.”
For Tim, automatic camera self-healing is probably the most significant feature in the 1.3 update. For anyone who has used a motion capture system before, he says, having to stop and re-calibrate is irritating, inconvenient and undesirable. “Vicon always focuses on minimising the need to do so. But with Origin and Evoke we were able to take a fresh look at this, because our customers can’t stop an experience half-way through and ask the participants to pause and recalibrate,” he said.
When turned on, auto-healing runs continuously in the background. The software evaluates the data it receives from each camera individually in real time and checks for any disagreement with the other cameras. If a camera’s data disagrees, the software automatically removes that camera from the system and, in a process similar to the full calibration described above, uses its observations relative to the other cameras to reassess and reposition it. The camera is then checked and returned to the system. The whole process takes a few seconds and happens entirely in the background.
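A common way to detect the kind of disagreement described above is to compare each camera's 2D observations with where the shared 3D solution predicts the markers should appear. The sketch below uses a simplified pinhole projection and an arbitrary threshold; it is a generic illustration of the idea, not Vicon's actual healing algorithm.

```python
# Hedged sketch of disagreement detection: a camera whose mean reprojection
# error exceeds a threshold would be flagged for re-estimation. The camera
# model (axis-aligned pinhole) and the 2-pixel threshold are assumptions.

def project(point3d, camera):
    """Simplified pinhole projection: translate by camera position, divide by depth."""
    x, y, z = (p - c for p, c in zip(point3d, camera["position"]))
    f = camera["focal"]
    return (f * x / z, f * y / z)

def camera_disagrees(camera, points3d, observations, threshold=2.0):
    """Flag a camera whose mean reprojection error (in pixels) exceeds threshold."""
    errors = []
    for p3d, obs in zip(points3d, observations):
        u, v = project(p3d, camera)
        errors.append(((u - obs[0]) ** 2 + (v - obs[1]) ** 2) ** 0.5)
    return sum(errors) / len(errors) > threshold
```

A flagged camera would then have its pose re-solved from its own marker observations, relative to the cameras that still agree, before being readmitted.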
“In use, it’s simply amazing to watch,” said Tim. “We’ve seen it in action at busy trade shows with crowds of people or at VR arcades where waiting players lean or kick on the frame holding the cameras, causing it to move or vibrate. Without healing, this would have required a complete stop and recalibration. With healing, the experience can continue uninterrupted with the players in VR sometimes not even realising anything has happened.”
Active Pulsar Markers
Tim also noted that Origin is specifically designed for active Pulsar marker clusters, and that is its primary use. “Active Pulsars are particularly well-suited to a VR experience with members of the public,” he said. “They’re more rugged than passive markers and take more punishment, which is a key factor, as anyone who has worked in such environments will know. It also means we don’t need to use an infrared strobe on the cameras, which avoids possible interference with other VR equipment and creates less IR background noise in the environment, making the healing algorithms more effective.
“There are downsides to active markers too, so they need to fit the application for the tracking system – as a motion tracking company, Vicon can cater to both. Active markers, for example, need a power supply and recharging which requires managing and means extra weight. They are also usually visible from a reduced angle compared to passive markers. That is why tracking results with passive markers are often slightly better – we typically recommend them for non-VR use cases. It doesn’t cause a problem in VR environments, but in a medical context where accuracy is paramount for a good medical diagnosis, users have different needs.”
Interestingly, although the entire Origin range was originally developed specifically for large VR location-based entertainment (LBE) experiences, many of its advantages translate well to VR simulations beyond entertainment, such as training applications, which are normally not as large in scale. Vicon is broadening its use of active markers to certain use cases, and the needs of customers will determine what works best in each situation. www.vicon.com
Smartgloves is a new inertial system for accurately tracking hands and fingers, made by motion capture developers Rokoko. The system’s hardware captures individual fingers precisely, preserving the fine details of hands in motion and recording the full range of an actor’s hand performance. The gear is also sturdy, to hold up during fight sequences. It gives VR game developers and VFX and digital artists a faster way to create more realistic, expressive characters.
Using Smartgloves also means one less element of the body to animate manually. Instead of animating individual finger movements, which is very time-consuming, or doing without, artists can put on the Smartgloves themselves and act out the movements they need. The gloves track movement using the Inertial Measurement Unit (IMU) method, which is the same system used to power Rokoko’s wearable Smartsuit Pro system that records skeletal movements of the performer's entire body.
The IMUs built into Rokoko’s systems are small electronic sensors that measure and record the body's specific force, angular rate and orientation, combining measurements from accelerometers, gyroscopes and in some cases magnetometers. Because they are not optical and don’t rely on a camera array to detect visible markers on actors performing inside a volume, Rokoko gear can be set up and used in most situations, at a location or on a stage.
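The fusion of accelerometer and gyroscope readings can be sketched with a minimal complementary filter, shown here for a single pitch angle. Real IMU systems such as Rokoko's fuse full 3D orientation with more sophisticated filters; this is only an illustration of the principle.

```python
# A minimal complementary filter sketch: blend the integrated gyro rate
# (smooth but drifting) with the accelerometer's gravity-referenced angle
# (noisy but drift-free). Single-axis, illustrative only.
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Update a pitch estimate (radians) from one IMU sample.
    gyro_rate: angular rate in rad/s; accel: (ax, ay, az) in m/s^2."""
    ax, ay, az = accel
    accel_pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# At rest and level: gyro reads 0, gravity is along z, the estimate stays near 0.
pitch = complementary_filter(0.0, 0.0, (0.0, 0.0, 9.81), dt=0.0025)
```

The weighting constant alpha trades short-term gyro smoothness against long-term accelerometer stability, which is the basic idea behind IMU orientation tracking.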
Calibrating Smartgloves before recording takes about one minute. The gloves then connect through WiFi to the computer or smart device, where they are automatically synced with other hardware inputs and where you can edit and refine the recordings.
Data the user generates is exported, processed and edited as part of a larger digital scene, or it can be stored to use later. Rokoko Studio is the default data editing software, available to all Rokoko capture system users free of charge, and the data can also be live streamed directly onto a custom character in the major 3D tools. This is done using FBX, BVH or CSV export formats with Rokoko’s native plugins. Plugins are available for Unreal, Unity, Houdini, Maya, Blender and MoBu, or developers can build their own custom plugin.
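As a rough illustration of working with such an export, the sketch below parses a CSV of per-frame joint rotations. The column names and layout are invented for this example; the actual schema of Rokoko Studio's CSV export should be taken from its documentation.

```python
# Hypothetical sketch: reading a CSV export of per-frame joint rotations.
# The SAMPLE data and column names are invented, not Rokoko's real schema.
import csv
import io

SAMPLE = """frame,thumb_rx,thumb_ry,thumb_rz
0,12.5,-3.1,0.4
1,13.0,-2.9,0.5
"""

def load_rotations(text):
    """Return a list of {joint_axis: degrees} dicts, one per frame."""
    frames = []
    for row in csv.DictReader(io.StringIO(text)):
        frames.append({k: float(v) for k, v in row.items() if k != "frame"})
    return frames

frames = load_rotations(SAMPLE)
```

For production use, the native plugins for Unreal, Unity, Maya and the other listed tools would normally handle this import step directly.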
With Rokoko Studio Live, it’s possible to live stream and sync one or multiple motion capture inputs into the same scene, and then live forward this data to several third party packages at the same time.
The gloves are made of stretchy fabric and leather, with seven sensors placed to avoid restricting movement during capture, and open fingertips to allow interaction with real objects. They come in different sizes to accommodate most hands, because they need to fit tightly to keep the sensors in place. The sensors, embedded into the fabric and operating at 400 Hz, track six degrees of freedom with a 3D orientation accuracy of ±1 degree. These particular motion trackers do not include magnetometers, which makes them immune to magnetic distortion.
A hub placed on the back of each hand collects the tracking data from the fingers and forearm, then fuses and live streams it to a PC. The gloves give up to 6 hours of operation depending on the batteries used, and up to 100m of wireless range depending on the WiFi access point, streaming at 100 frames per second. About 20ms of latency is expected on a standard WiFi router.
Smartgloves can be used standalone, to focus only on hand movements, or as part of a larger tracking system, integrating with most other mocap systems, both optical and inertial, as well as with the Rokoko Smartsuit Pro. Smartgloves users also receive frequent software updates, introducing more functionality and increasing compatibility, at no extra cost. www.rokoko.com