Groundbreaking VR experiences showcase IKinema's LiveAction as the leading solving technology for delivering convincing motion in live animated performances within Unreal Engine 4
FARNHAM, UK, 5 January 2017 – IKinema, the leader in real-time inverse kinematics, today announced the third edition of LiveAction, the company’s Virtual Reality and Virtual Production animation technology, used by 3D animation teams and directors for live performances to mass audiences.
IKinema LiveAction enables users to live-stream and clean motion capture data taken during an actor’s performance with the assistance of motion capture suits. The technology boasts zero lag – no latency between the actor performing the motion and its instant retargeting and display on the virtual avatar. Characters move with authentic human or creature behaviours and interact with props in their virtual environments, all performed within Unreal Engine 4.
The technology is relied on by producers and animation studios because it ensures stable animation workflows, produces the best-quality solve from MoCap data, and delivers high-quality previews during live performances. Additionally, producers and directors have greater freedom to make quick, instinctive changes in real time.
LiveAction streams and solves dozens of characters with zero lag
Groundbreaking Live VR Experiences
Hasbro, Facebook Live Broadcast: During the launch announcement of the Monopoly Ultimate Banking game, LiveAction acted as the bridge between motion captured from an actor performing in the studio and the 3D-animated Mr. Monopoly, retargeting the MoCap data directly onto the character as he moved within a cartoon stage. Running in parallel with several other technologies, LiveAction helped bring Mr. Monopoly to life by producing the realistic mannerisms needed for a convincing performance, as well as an unbreakable pipeline during a live Q&A session with fans in August 2016.
Globo TV, 2016 Rio Olympics: IKinema’s LiveAction provided the Globo TV team with a fully reliable real-time pipeline to stream and retarget data captured from athletes – the likes of Usain Bolt – to their animated avatars for live broadcasts across Brazil throughout the Rio Olympics. Real presenters could analyse match replays, walk around, and interact with virtual athletes and their environments in real time, all within virtual stadium scenes.
Ninja Theory, Hellblade: Senua’s Sacrifice: In this award-winning production, LiveAction seamlessly delivered perfectly matched actor behaviour on-set, resulting in convincing humanoid motion for the virtual Senua character on-screen. The technology was used alongside cutting-edge, first-ever-seen VFX techniques. The show unfolded directly in front of a live audience, winning the Best Real-Time Graphics & Interactivity Award at SIGGRAPH 2016.
"It gives us a huge sense of pride to know that LiveAction enables teams to deliver the most convincing live performances to audiences, whether it’s to tens, hundreds, or in some cases the world’s centre stage. The tech especially comes into play when the pressure is on behind the scenes. The latest improvements we’ve made in LA 3.0 give users sharper methods to produce the best quality data, and reduce repetition by letting them save and reuse templates – these are huge time-saving features, which are needed during a live show."
Ahmed Elhasairi | Technical Director | IKinema
New to LiveAction 3.0
Pricing and Availability
IKinema LiveAction 3.0 is available for the Windows platform. Virtual Production users can purchase node-locked or floating licenses from £2500.00. Choose your level of license from the IKinema website.