IKinema Developing Natural Language Animation Interface
August 10, 2015

GUILDFORD, UK — IKinema (www.ikinema.com), a developer of real-time inverse kinematics technology, announced at SIGGRAPH a brand-new animation concept that will enable anyone to create animation using descriptive commands based on everyday language.
“The idea is very simple,” explains IKinema chief executive Alexandre Pechev. “Imagine being able to drive the actions of your favorite animation hero in real time using simple written or voice commands such as 'walk, turn left and then run to the red chair.' IKinema is turning that dream into reality.”

The technology, currently in prototype, is part of a two-year project code-named INTiMATE that is backed by the UK government’s Innovate UK program. 

“We’ve developed an innovative way to convert libraries of animation into a run-time rig using a natural language interface,” says Pechev. “The result is a seamless transition from one animation to another just by using normal, everyday words.”
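
IKinema has not published implementation details, but a minimal sketch of the concept might look like the following, in which a command string is matched against clips in an animation library and adjacent clips are overlapped so a run-time rig could cross-fade between them. Every name and data structure here is a hypothetical illustration, not IKinema's actual API.

    # Hypothetical sketch of a natural-language-to-animation pipeline.
    # None of these names come from IKinema; they only illustrate mapping
    # everyday commands onto clips in an animation library and scheduling
    # them with overlaps for run-time blending.

    from dataclasses import dataclass


    @dataclass
    class Clip:
        """A clip in the animation library: name and duration in seconds."""
        name: str
        duration: float


    # A toy "library" keyed by the actions a command might contain.
    LIBRARY = {
        "walk": Clip("walk_cycle", 2.0),
        "turn left": Clip("turn_left_90", 0.8),
        "run": Clip("run_cycle", 1.5),
    }


    def parse_command(command: str) -> list[Clip]:
        """Very naive parser: split on 'and then' / commas, look up known actions."""
        clips = []
        for phrase in command.lower().replace(" and then ", ",").split(","):
            phrase = phrase.strip()
            for action, clip in LIBRARY.items():
                if phrase.startswith(action):
                    clips.append(clip)
                    break
        return clips


    def schedule_with_blends(clips: list[Clip], blend: float = 0.3) -> list[tuple[str, float]]:
        """Return (clip name, start time) pairs, overlapping consecutive clips
        by `blend` seconds so the rig can cross-fade between them."""
        timeline, t = [], 0.0
        for clip in clips:
            timeline.append((clip.name, t))
            t += clip.duration - blend
        return timeline


    if __name__ == "__main__":
        print(schedule_with_blends(parse_command("walk, turn left and then run to the red chair")))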

Virtual and augmented reality technologies such as Magic Leap and Microsoft HoloLens are fuelling the demand for advanced ways of producing animation. At the same time, the expectation is that these new methods will be simple enough for a mass audience to use.

“What is simpler than using natural language and speech?” asks Pechev. “INTiMATE is so easy to use — anyone can bring in a character and animate it from a vast cloud-based animation library just by describing what they want their character to do.”

Designed to be accessible to the masses yet powerful enough for professionals, INTiMATE has many applications, spanning pre-production, games, virtual production, architecture, training and simulation, as well as virtual and augmented reality. Although the technology is aimed at a mass audience, Pechev says it will create vast opportunities for professional animators too.

“Since INTiMATE works with a library of animations, the animator plays a central role in providing the required look and feel of the library. IKinema just seamlessly blends one action with another to generate smooth and continuous motion in the virtual world and then adapts the action to the environment.”
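
The press release does not describe how the blending works, but a simple way to picture it is a cross-fade: during the overlap between two clips, each joint value is interpolated from the outgoing pose to the incoming one. The sketch below is purely illustrative; production systems blend quaternion rotations and then run inverse kinematics to adapt the motion to the environment, and none of these function names are IKinema's.

    # Hypothetical illustration of cross-fade blending between two poses.
    # Real systems blend rotations and apply IK afterwards; this only shows the idea.


    def blend_poses(pose_a: dict[str, float], pose_b: dict[str, float], weight: float) -> dict[str, float]:
        """Linearly interpolate two poses (joint name -> angle) by `weight` in [0, 1]."""
        return {joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint] for joint in pose_a}


    def crossfade_weight(t: float, blend_start: float, blend_duration: float) -> float:
        """Smoothstep weight ramping from 0 to 1 over the blend window."""
        x = min(max((t - blend_start) / blend_duration, 0.0), 1.0)
        return x * x * (3.0 - 2.0 * x)


    if __name__ == "__main__":
        walk_end = {"hip": 10.0, "knee": 35.0}
        run_start = {"hip": 25.0, "knee": 60.0}
        w = crossfade_weight(t=0.15, blend_start=0.0, blend_duration=0.3)
        print(blend_poses(walk_end, run_start, w))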

The technology is expected to become commercially available in 2016, with the aim of making an SDK available for any animation package. The company currently has a working prototype and has engaged with top studios to validate and develop the technology.