With each release of Unreal Engine, Epic aims to bring features that feel like the future of virtual production into the present for filmmakers to easily pick up and use. In the 4.25 release, this meant building something that ships alongside the engine rather than inside it: a new iOS app, Live Link Face for Unreal Engine, available starting today on the App Store.
Live Link Face streams high-quality facial animation in real time from a user’s iPhone directly onto characters in Unreal Engine. The app’s tracking leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting this data directly to Unreal Engine via Live Link over a network. Designed to excel both on professional capture stages with multiple actors in full motion capture suits and at a single artist’s desk, the app delivers expressive and emotive facial performances in any production situation.
Collaborative virtual production is a particular emphasis of the app, with multicast networking that streams Live Link data to all machines in a Multi-User Editor session simultaneously to minimize latency. Robust timecode support and precise frame accuracy enable seamless synchronization with other stage components like cameras and body motion capture. Live Link Face also integrates with Tentacle Sync, allowing it to connect to the stage master clock over Bluetooth and ensuring that its recordings line up perfectly in editorial with those of every other device from the shoot. Sophisticated productions can also make use of the app’s OSC (Open Sound Control) support, which lets external applications control it remotely to do things like initiate recording on multiple iPhones with a single click or tap.
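To make the OSC remote-control idea concrete, here is a minimal sketch of sending an OSC message over UDP using only the Python standard library. The encoder follows the OSC 1.0 wire format (null-padded strings, big-endian int32 arguments); the specific address `/RecordStart`, its arguments, and the port number in the usage comment are placeholders, not the app’s documented OSC API — consult the Live Link Face settings and documentation for the actual values.

```python
import socket
import struct

def _osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message (string and int32 arguments only)."""
    tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, str):
            tags += "s"
            payload += _osc_pad(arg.encode("ascii"))
        elif isinstance(arg, int):
            tags += "i"
            payload += struct.pack(">i", arg)
        else:
            raise TypeError(f"unsupported OSC argument type: {type(arg)}")
    return _osc_pad(address.encode("ascii")) + _osc_pad(tags.encode("ascii")) + payload

def send_osc(host: str, port: int, address: str, *args) -> None:
    # One UDP datagram per OSC message, as in a typical OSC setup.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, *args), (host, port))

# Hypothetical usage: trigger recording on several phones with one call.
# The IPs, port, address, and arguments below are illustrative only.
# for ip in ["10.0.0.11", "10.0.0.12"]:
#     send_osc(ip, 8000, "/RecordStart", "SlateName", 1)
```

A dedicated OSC library would handle the full type set (floats, blobs, bundles), but for firing a handful of control messages at capture devices, a hand-rolled encoder like this is often all that is needed.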
Live Link Face’s feature set goes beyond the stage, providing additional flexibility for other key use cases. Streamers who sit at a desk rather than wearing a head-mounted rig and mocap suit benefit from the app’s ability to include head and neck rotation data as part of the facial tracking stream, giving their digital avatars greater freedom of movement with just the iPhone. Animators can also take advantage of the option to record both the raw blendshape data (CSV) and a front-facing video (MOV), each striped with timecode, to use as reference material if further adjustments to the performance need to be made in-engine.
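As a sketch of how an animator might work with the recorded blendshape CSV, the snippet below parses per-frame blendshape weights into named animation curves using only the standard library. The column layout here is an assumption for illustration — a timecode column followed by one column per ARKit blendshape (names like eyeBlinkLeft and jawOpen come from ARKit) — so the real file’s exact headers and format may differ.

```python
import csv
import io

# Hypothetical sample of a recorded blendshape CSV; the actual file's
# column names and order may differ from this assumed layout.
SAMPLE = """\
Timecode,eyeBlinkLeft,eyeBlinkRight,jawOpen
00:01:02:03.0,0.05,0.04,0.62
00:01:02:04.0,0.91,0.88,0.10
"""

def load_curves(csv_text):
    """Parse the CSV into per-blendshape curves keyed by blendshape name."""
    reader = csv.DictReader(io.StringIO(csv_text))
    curves = {name: [] for name in reader.fieldnames if name != "Timecode"}
    timecodes = []
    for row in reader:
        timecodes.append(row["Timecode"])
        for name, samples in curves.items():
            samples.append(float(row[name]))
    return timecodes, curves

timecodes, curves = load_curves(SAMPLE)
print(curves["jawOpen"])  # [0.62, 0.1]
```

Because each row carries a timecode, these curves can be lined up frame-accurately against the timecode-striped MOV and any other stage recordings when refining the performance in-engine.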