Faceware and Cloud Imperium Games Develop Real-Time Face-over-Internet

Faceware Technologies, a markerless 3D facial motion capture developer, and Cloud Imperium Games (CIG), makers of the first-person universe game 'Star Citizen', have been working together on real-time, player-driven facial animation, which will soon become a feature of the game.

Face Over Internet

Called Face Over Internet Protocol (FOIP), the system will track players’ facial expressions and movements, then re-project the motion data onto their avatars in-game, allowing immersive, realistic player-to-player communication. CIG demonstrated the new functionality at its Gamescom event on 25 August in Cologne, Germany.

The face-over-internet system was built with Faceware’s recently announced LiveSDK, which is powered by Image Metrics’ real-time capabilities. Using Faceware’s upcoming facial motion sensor device, now at the prototype stage of development, Cloud Imperium and any Star Citizen player will be able to detect hundreds of facial movements in variable lighting conditions. These movements are then instantly streamed onto the character’s face. The in-game immediacy of the re-projection is as critical to creating believable digital faces as the accuracy of the capture.
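Neither company has published the data format behind FOIP, but conceptually the streamed payload is a small, per-frame set of expression and head-pose values travelling alongside the player’s voice. A minimal sketch in C++, in which the channel count, field names and packet layout are illustrative assumptions rather than Faceware’s actual format:

    // Hypothetical sketch of the per-frame facial data a FOIP-style stream might carry.
    // The channel count, field names and layout are assumptions, not Faceware's format.
    #include <array>
    #include <cstdint>
    #include <cstdio>

    constexpr int kNumChannels = 32;                // e.g. brow raise, jaw open, lip corner pull...

    struct FacePacket {
        uint32_t frameIndex;                        // capture frame number (60 per second)
        float    headYaw, headPitch, headRoll;      // head pose in degrees
        std::array<float, kNumChannels> expression; // normalised 0..1 expression values
    };

    int main() {
        FacePacket p{};
        p.frameIndex = 1200;                        // 20 seconds into the session at 60 fps
        p.headYaw = 4.5f;
        p.expression[0] = 0.8f;                     // e.g. "jaw open" while the player speaks

        // At 60 packets per second this payload is tiny next to the accompanying voice stream.
        std::printf("packet size: %zu bytes, bandwidth: %zu bytes/s\n",
                    sizeof(FacePacket), sizeof(FacePacket) * 60);
        return 0;
    }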

Motion Logic

The process works through a range of outputs from Faceware's LiveSDK, including eye tracking. These outputs pass through a layer called Motion Logic and are mapped onto the character's runtime rig. Each Star Citizen character has its own unique rig so that it reacts to the inputs appropriately, as that character would – in other words, the character's moves don't simply mimic the player's moves.
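Faceware has not published the Motion Logic interface, but the idea of remapping tracker outputs through per-character response curves before they reach the rig can be sketched as below. The channel names, curve shapes and profile structure are invented purely for illustration:

    // Illustrative sketch of a Motion Logic-style layer: raw tracker outputs are remapped
    // through per-character response curves before driving the runtime rig, so each
    // character reacts in its own way instead of mirroring the player 1:1.
    // All channel names and curve shapes are assumptions for illustration.
    #include <cmath>
    #include <cstdio>
    #include <functional>
    #include <map>
    #include <string>

    using Curve = std::function<float(float)>;      // tracked value 0..1 -> rig control value

    struct CharacterProfile {
        std::map<std::string, Curve> remap;         // tracker channel -> character response
    };

    int main() {
        // A stoic character damps brow movement and only smiles for strong input.
        CharacterProfile stoic;
        stoic.remap["browRaise"] = [](float x) { return 0.4f * x; };
        stoic.remap["smile"]     = [](float x) { return std::pow(x, 2.0f); };

        std::map<std::string, float> trackerOutput = { {"browRaise", 0.9f}, {"smile", 0.7f} };

        for (const auto& [channel, value] : trackerOutput) {
            float rigValue = stoic.remap[channel](value);   // this value would drive the rig
            std::printf("%s: tracked %.2f -> rig %.2f\n", channel.c_str(), value, rigValue);
        }
        return 0;
    }

Because the remapping lives in a per-character profile, the same tracked brow raise can be damped for one character and exaggerated for another.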

The system is still in its early days. While the current focus is on improving the accuracy of the motion data mapped onto the character models and the ability to track movement, it is also important to introduce new phonemes into the system so that the runtime logic for each character reacts exactly as gamers expect. It's worth remembering that the SDK outputs do not replace traditional animation, but serve to drive a character in real time.

Faceware has devoted seven years to character-to-character communication. However, this collaboration with Cloud Imperium, which began about a year ago, is the team's first opportunity to integrate its real-time system into a game engine, allowing characters to interact and emote in real time, live-driven by the players themselves. It proved to be an engineering challenge, of course, one the developers felt they could best meet not by relying on the gamer's webcam but with a dedicated device built to a particular specification.

Frame-to-Frame Analysis

The result so far is the prototype motion sensor device mentioned above, made specifically for facial tracking. Its image sensor records at 60 fps. A high frame rate was chosen because the motion is derived by analysing the expression in one frame and comparing it to the expression in the next, producing an animated result over time. The sooner that next frame is captured, the smoother the animation will be. Thus, while a webcam running at 25 or 30 fps will certainly work, it won't achieve as refined a quality as 60 fps capture, in which the frames follow one another quickly enough to capture word formation as the player speaks.
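The effect of the capture rate is easy to see in numbers: at 60 fps the gap between analysed frames is roughly 16.7 ms, against 33 to 40 ms at 30 or 25 fps, so each frame-to-frame change in expression is smaller and the resulting animation less steppy. A small sketch of that frame-delta reasoning, using an invented expression curve:

    // Sketch of why capture rate matters for frame-to-frame analysis: the same lip movement
    // sampled at 60 fps produces smaller per-frame jumps than at 30 or 25 fps.
    // The expression curve below is invented purely for illustration.
    #include <cmath>
    #include <cstdio>

    // Pretend "jaw open" value while a short word is formed (one open/close cycle in 0.2 s).
    double jawOpen(double tSeconds) {
        const double pi = 3.141592653589793;
        return 0.5 * (1.0 - std::cos(2.0 * pi * tSeconds / 0.2));
    }

    void report(double fps) {
        double dt = 1.0 / fps;
        double maxJump = 0.0;
        for (double t = 0.0; t + dt <= 0.2; t += dt) {
            double jump = std::fabs(jawOpen(t + dt) - jawOpen(t));  // frame-to-frame change
            if (jump > maxJump) maxJump = jump;
        }
        std::printf("%2.0f fps: frame gap %4.1f ms, largest per-frame change %.2f\n",
                    fps, dt * 1000.0, maxJump);
    }

    int main() {
        report(25.0);
        report(30.0);
        report(60.0);   // smaller per-frame changes give smoother re-projected motion
        return 0;
    }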

Another advantage of Faceware's device is low-light performance, because most people using it will be in a room lit by little more than their computer screen. The motion sensor therefore has a 1/3-inch 3.4MP image sensor capable of 720p or 1080p video at 60 fps with good low-light sensitivity – all considerably better than a typical webcam.

Most webcams also have another drawback – a wide-angle lens that makes the player's face appear fairly small within the frame, allocating fewer pixels to the face. Faceware's high-resolution glass lens records a single-person, straight-on view, so that the face occupies much more of the frame and more pixels are devoted to facial detail.
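The pixel argument can be made concrete with a little geometry. The face width, viewing distance and both fields of view below are assumptions chosen only to illustrate the point, not published specifications:

    // Back-of-the-envelope comparison of how many horizontal pixels land on the face with a
    // wide-angle webcam versus a narrower, straight-on lens at the same desk distance.
    // Face width, distance and both fields of view are illustrative assumptions.
    #include <cmath>
    #include <cstdio>

    double pixelsOnFace(double horizontalFovDeg, int horizontalRes,
                        double faceWidthM, double distanceM) {
        const double pi = 3.141592653589793;
        // Width of the scene covered by the sensor at the subject's distance.
        double sceneWidth = 2.0 * distanceM * std::tan(horizontalFovDeg * pi / 360.0);
        return horizontalRes * (faceWidthM / sceneWidth);
    }

    int main() {
        const double face = 0.15;   // ~15 cm face width
        const double dist = 0.60;   // ~60 cm from the screen

        // Typical wide-angle webcam versus a hypothetical narrower single-person lens.
        std::printf("wide webcam (~78 deg, 1280 px): %.0f px across the face\n",
                    pixelsOnFace(78.0, 1280, face, dist));
        std::printf("narrow lens (~40 deg, 1920 px): %.0f px across the face\n",
                    pixelsOnFace(40.0, 1920, face, dist));
        return 0;
    }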

In-game Camera

Meanwhile, Cloud Imperium Games is also using the detailed head position information to output accurate head tracking that drives the in-game POV camera. This improves situational awareness by allowing players to look around the cockpit without extra hardware or peripherals, and can be combined with in-game zoom.
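CIG has not detailed how the head-tracking data drives the camera, but the basic idea of clamping and smoothing the tracked head angles before applying them to the cockpit POV camera might look roughly like the following sketch, in which the limits and smoothing constant are illustrative assumptions:

    // Rough sketch of head tracking driving an in-game POV camera: tracked yaw and pitch are
    // clamped to a comfortable range and smoothed before being applied each frame.
    // The limits and smoothing constant are illustrative assumptions, not CIG's values.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct HeadPose { float yawDeg; float pitchDeg; };

    class CockpitCamera {
    public:
        void update(const HeadPose& tracked, float dt) {
            // Clamp so the player cannot turn the view through the back of the seat.
            float targetYaw   = std::clamp(tracked.yawDeg,  -80.0f, 80.0f);
            float targetPitch = std::clamp(tracked.pitchDeg, -45.0f, 60.0f);

            // Exponential smoothing hides tracker jitter without adding much lag.
            float alpha = 1.0f - std::exp(-dt / 0.05f);   // ~50 ms time constant
            yaw_   += alpha * (targetYaw   - yaw_);
            pitch_ += alpha * (targetPitch - pitch_);
        }
        float yaw()   const { return yaw_; }
        float pitch() const { return pitch_; }
    private:
        float yaw_ = 0.0f, pitch_ = 0.0f;
    };

    int main() {
        CockpitCamera cam;
        // Simulate the player glancing left over ten 60 fps frames.
        for (int frame = 0; frame < 10; ++frame) {
            cam.update({ -30.0f, 5.0f }, 1.0f / 60.0f);
            std::printf("frame %d: camera yaw %.1f deg, pitch %.1f deg\n",
                        frame, cam.yaw(), cam.pitch());
        }
        return 0;
    }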


Chairman and CEO of Cloud Imperium Games Chris Roberts said, “The ability to detect and stream the facial movements of players in real-time means that we’ll be able to deliver the range of human emotion, not just voice. Our players’ facial expressions will be translated onto their avatars’ faces. Combined with a player’s voice correctly positioned in the virtual world, this system gives us the chance to create extremely lifelike player-to-player communication.”

www.facewaretech.com