VANCOUVER — At SIGGRAPH 2018, Mo-Sys is showcasing its new StarTrackerVFX tool at Waskul Entertainment’s StudioXperience broadcast studio and technology space. The patented optical camera tracking system is being used to demonstrate how it brings virtual and real worlds together in realtime, removing the complexity, time and budget constraints associated with virtual production. The StudioXperience space is sponsored by HP and Nvidia.
Mo-Sys is demonstrating a realtime composite of interviews filmed on a green screen and rendered by the Unreal Engine, which reduces the need for costly fixes in post. The green-screen footage and captured camera tracking data are then fed into The Foundry’s Nuke compositing suite, where VFX are added in just 20 minutes – completing the full VFX workflow on the fly.
In a separate demonstration, the StudioXperience stage is showcasing camera tracking from Mo-Sys’ StarTrackerTV solution for TV studios and broadcasters, combined with realtime rendering from Zero Density. StarTrackerVFX simplifies an otherwise complex process of aligning a moving camera within the virtual environment. Its direct plug-in for the Unreal Engine enables users to film actors in front of a green screen and effortlessly immerse them in photorealistic environments. The camera tracking data is recorded with a timecode stamp and can be exported as FBX camera data for post production, re-rendering and visual effects.
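As a rough illustration of the workflow described above – per-frame camera tracking samples stamped with timecode, then handed off to post for re-rendering – here is a minimal Python sketch. All names here (TrackingSample, to_keyframes) are hypothetical for illustration; they are not Mo-Sys or FBX SDK APIs, and a real pipeline would write an actual FBX camera node rather than plain tuples.

```python
# Hypothetical sketch: collecting timecode-stamped camera tracking samples
# and flattening them into keyframe rows a post tool could ingest before
# FBX conversion. Not a real Mo-Sys or FBX SDK interface.
from dataclasses import dataclass

FPS = 25  # assumed frame rate for the timecode stamp


@dataclass
class TrackingSample:
    frame: int        # absolute frame count, paired with a timecode stamp
    position: tuple   # camera position (x, y, z), e.g. in metres
    rotation: tuple   # pan, tilt, roll in degrees


def timecode(frame: int, fps: int = FPS) -> str:
    """Convert an absolute frame count to an HH:MM:SS:FF timecode string."""
    ff = frame % fps
    s = frame // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"


def to_keyframes(samples):
    """Flatten samples into (timecode, x, y, z, pan, tilt, roll) rows."""
    return [(timecode(s.frame), *s.position, *s.rotation) for s in samples]


# Three invented samples of a camera dollying in at the one-hour mark.
samples = [
    TrackingSample(90_000 + i, (0.0, 1.6, 3.0 - 0.1 * i), (0.0, -5.0, 0.0))
    for i in range(3)
]
rows = to_keyframes(samples)
print(rows[0][0])  # timecode of the first sample
```

Stamping every sample with timecode is what lets the tracking data be conformed back to the recorded footage in post, whether for re-rendering or for adding visual effects.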
StarTracker is an upward-facing sensor that attaches to any broadcast or film camera. It points at the ceiling, rather than looking forward, and uses retro-reflective stickers – also known as stars – as a reference. These are robust, inexpensive, easily installed and offer immunity to changes in light. A one-time, 30-minute procedure auto-maps the sensor’s position and references it to the real world, after which no further star calibration or ‘homing’ is required.