In Ted 2, the eponymous hero is getting married, considering fatherhood and even attempting to persuade the Commonwealth of Massachusetts that he’s legally a person. There’s also a big musical number. With Mark Wahlberg returning as John Bennett, the thunder buddies for life are venturing into uncharted territory. It’s time to legalize Ted.
Yet despite the complexity of the action, the film’s motion capture was not the product of some multimillion-dollar facility. Instead it was produced live on set by Universal and MRC using Xsens’ lightweight MVN inertial motion-capture suits.
Credit: Iloura/Universal Pictures and Media Rights Capital. Copyright Universal Pictures.
Freedom to record anywhere
“Optical mocap wouldn't have been feasible given the demands of the production,” says Postvis Supervisor Webster Colcord, an industry veteran with a career spanning three decades. “We needed a portable system that could be quickly set up anywhere with minimal impact on the rest of the crew.”
Duplicating the original film’s capture process, Colcord and his team used an MVN system to record writer-director Seth MacFarlane, who also voices Ted, as he performed the character live. The sensors were simply strapped over his normal clothing, leaving the director free to carry out his other duties.
For the big musical number, which sees Ted accompanying a large group of dancers, mocap was recorded from three separate professional dancers and combined to create Ted’s impressive moves. To pull this all together, Colcord turned to MVN Link: Xsens’ next-generation Lycra suit, which captures an actor’s motion at up to 240Hz and at distances of up to 150m.
“The dancers could run as far as they wanted without us ever having to constrain their distance,” says Colcord. “Not having to worry about staying within a volume [as with optical motion capture] is a great advantage – as is the fact that we don't have to worry about markers being occluded.”
While accurately recording the motion of a dancer with both feet off the floor would previously have been the sole preserve of optical capture, Xsens’ recent advances in inertial capture technology made even high-intensity sequences possible.
Freedom to use off-the-shelf tools
Another advantage of the MVN system is that it doesn't require specialist hardware to operate. Colcord and his team used a BOXX Technologies workstation, purchased for the first film, as their primary capture machine, with a couple of off-the-shelf ASUS Republic of Gamers laptops as back-up.
The team captured three streams of data, all synchronized by timecode: reference video of the performance, the motion-capture data itself, and a screen recording from the video card of the live mocap streaming into Autodesk's MotionBuilder software, where it was retargeted onto the CG character.
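Timecode is what makes the three-stream setup workable: each recording carries the same running clock, so the streams can be brought back into step after the fact. The minimal Python sketch below illustrates the idea; the frame rate, stream names and start timecodes are assumptions for the example, not production data.

```python
# Illustrative sketch: aligning three timecode-stamped capture streams.
# Stream names, start timecodes and the frame rate are hypothetical.

FPS = 24  # assumed project frame rate


def tc_to_frames(tc, fps=FPS):
    """Convert an 'HH:MM:SS:FF' SMPTE timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff


# Hypothetical start timecodes recorded for each stream during a session.
streams = {
    "reference_video": "14:32:10:05",
    "mocap_data":      "14:32:08:00",
    "mobu_screen_rec": "14:32:09:12",
}

# Use the latest-starting stream as the common sync point and report how far
# each of the others needs to be offset to line up with it.
start_frames = {name: tc_to_frames(tc) for name, tc in streams.items()}
sync_point = max(start_frames.values())

for name, start in start_frames.items():
    print(f"{name}: offset by {sync_point - start} frames to reach the sync point")
```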
“Xsens’ tools for streaming to MotionBuilder and outputting data in various configurations are simple and easy to use,” says Colcord. “We have an in-house postvis rig that we built for Ted, which is very lightweight and has a multilayered skeleton, which allows us to offset the mocap [in order to adjust the performance] without animation layers, and without being destructive to the original data.”
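Colcord doesn't detail how the rig is built, but the general pattern of a layered skeleton can be sketched with Maya's Python API: an extra offset node rides on top of each mocap-driven joint, so adjustments are keyed on the offsets while the raw capture data stays untouched. The joint names below are hypothetical, and the real Ted postvis rig is certainly more involved.

```python
# Rough sketch of a layered-skeleton offset setup in Maya (maya.cmds).
# Joint names are hypothetical; the actual postvis rig is more involved.
import maya.cmds as cmds


def insert_offset_nodes(mocap_joints):
    """For each mocap-driven joint, add an 'offset' transform parented under it.
    The offset's local translate/rotate act as a hand-keyed delta on top of the
    capture, so the raw mocap keys on the source joint are never touched."""
    offsets = []
    for joint in mocap_joints:
        offsets.append(cmds.group(empty=True, name=joint + "_offset", parent=joint))
    return offsets


def attach_postvis_skeleton(offset_nodes, postvis_joints):
    """Constrain the postvis joints to the offset nodes, so the final pose is
    mocap plus manual offset, with neither layer overwriting the other."""
    for offset, target in zip(offset_nodes, postvis_joints):
        cmds.parentConstraint(offset, target, maintainOffset=True)


# Example usage with hypothetical joint names:
# offsets = insert_offset_nodes(["mocap_hips", "mocap_spine", "mocap_head"])
# attach_postvis_skeleton(offsets, ["ted_hips", "ted_spine", "ted_head"])
```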
For the postvis – a rough version of the visual effects, used to define the look and timing of shots early in the creative process – the team used a set of standard software packages, including SynthEyes and MatchMover for camera tracking, Maya for animation work, and After Effects for compositing.
“After a typical [recording] session, editorial cut the video elements in as picture-in-picture with timecode burns,” says Colcord. “I've developed my own method for syncing up the data in Maya, based on timecode, and that has been pretty successful.”
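Colcord's exact Maya method isn't published, but the underlying step is simple: once the frame offset between the mocap take and the editorial reference has been worked out from their timecodes (as in the earlier sketch), the mocap keys just need to be shifted by that amount. A rough illustration using Maya's Python API, with hypothetical node names:

```python
# Rough illustration of timecode-based mocap syncing in Maya (maya.cmds).
# Node names and the offset value are hypothetical.
import maya.cmds as cmds


def shift_mocap_keys(mocap_root, frame_offset):
    """Shift every keyframe under the mocap hierarchy by a frame offset derived
    from comparing the take's start timecode with the editorial reference."""
    nodes = [mocap_root] + (cmds.listRelatives(mocap_root, allDescendents=True) or [])
    cmds.keyframe(nodes, edit=True, relative=True, timeChange=frame_offset)


# Example usage: slide the whole capture 53 frames later to match the edit.
# shift_mocap_keys("mocap_hips", 53)
```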
The process complete, the assets were then passed to VFX facilities Iloura and Tippett Studio to create the final effects for the film, using the postvis as reference. “The synced data was passed along with minimal corrections,” says Colcord. “The VFX artists baked it onto their animation rig through a constraint set-up and did offsets on animation layers.”
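That bake-and-offset workflow is a common Maya pattern, and can be sketched roughly as follows; the node names, frame range and layer name are placeholders rather than details from the actual production.

```python
# Rough sketch of the constraint-bake-and-offset workflow in Maya (maya.cmds).
# Node names, the frame range and the layer name are placeholders.
import maya.cmds as cmds


def bake_mocap_onto_rig(pairs, start_frame, end_frame):
    """Constrain each rig node to its mocap source, bake the result to keys,
    then delete the constraints so the rig carries the animation itself."""
    constraints = [cmds.parentConstraint(src, dst, maintainOffset=True)[0]
                   for src, dst in pairs]
    cmds.bakeResults([dst for _, dst in pairs],
                     time=(start_frame, end_frame),
                     simulation=True)
    cmds.delete(constraints)


def create_offset_layer(rig_nodes, layer_name="mocap_offsets"):
    """Add an animation layer over the baked keys so artists can adjust the
    performance without editing the baked mocap curves underneath."""
    layer = cmds.animLayer(layer_name)
    cmds.select(rig_nodes)
    cmds.animLayer(layer, edit=True, addSelectedObjects=True)
    return layer


# Example usage with hypothetical node names and frame range:
# bake_mocap_onto_rig([("mocap_hips", "ted_hips"), ("mocap_spine", "ted_spine")], 1001, 1240)
# create_offset_layer(["ted_hips", "ted_spine"])
```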
Freedom to get higher
Thanks to the freedom afforded by Xsens' wearable MVN motion-capture systems, the creators of Ted 2 were able to push their creative process to new heights. MacFarlane could 'be' Ted live on set, letting the other actors react realistically to the bear, confident that the final CG character would match every nuance of his performance.
And with Xsens' recent advances in inertial technology, MRC's postvis team expanded the boundaries of on-set capture, tackling shots that would previously have been impossible. “We had three very active dancers flipping, jumping, spinning, and more,” says Colcord. “It was really fun to deal with the challenges of that situation – and a great test of the new hardware and software system.”