iPi Soft has upgraded its markerless iPi Motion Capture system to support the Unreal Engine development tools for real-time content creation. The update includes a new plug-in that enables real-time motion tracking, motion transfer to 3D characters and animation streaming into Unreal Engine.
Located in Moscow, iPi Soft develops motion capture software that uses image processing and computer vision algorithms to recognise and track the human body without markers. The company’s software is used to digitise the movement of a human skeleton, giving expression to 3D characters for video games and CG films, and supporting medical, military and other applications.
iPi Soft CEO Pavel Sorokin said that closer integration with Unreal Engine via the new plug-in lets motion capture users stream tracking results to Unreal both in live feedback mode and, later, in offline tracking mode, so they can see how motion will look on their character in their particular 3D setting. Users can also keep iPi Mocap Studio and the Unreal Engine editor open simultaneously, which means motion can be edited, and the animation changes viewed, in real time.
Before this development, iPi Mocap users working in Unreal Engine had to export animation from iPi Mocap Studio to FBX or BVH files, then import the assets into Unreal. The purpose of tighter integration with Unreal is more flexibility for iPi Mocap content creators, who will be able to create game, virtual production and other projects more quickly, and with more accurate results, without writing custom code for optimisation.
Digital agency New Discovery, based in Mexico and the US, took part in iPi Soft’s beta-testing programme for the iPi Mocap/Unreal Engine integration, using it for a VR documentary titled ‘Northern Route, Border Identity’. Funded by the Secretary of Cultural Affairs of the State of Chihuahua, Mexico, the project takes an empathetic look at the refugee experience that has gradually become part of the socio-economic fabric of the region at the Mexican-US border.
“The fidelity of the iPi Mocap data is impressive,” said Jonalex Herrera, CEO and co-founder of New Discovery. “Although we used the software with two Azure Kinect sensors instead of cameras, limiting us to a smaller space and slower, more deliberate motion, the results were very accurate. Best of all, we were able to push animation streams directly into Unreal to realistically share details of the refugees’ journey in search of a better future.” www.ipisoft.com