NVIDIA RTX-Powered VRWorks 360 Video SDK Offers New Features
Stanley Tack, NVIDIA
October 18, 2018

Creating high-quality, 360-degree stereo video experiences just got easier with the latest release of NVIDIA’s VRWorks 360 Video SDK.

Version 2.0 of the SDK, announced today at Adobe MAX, adds support for NVIDIA CUDA 10 and NVIDIA’s latest Turing-based Quadro and GeForce RTX GPUs. It accelerates stitching of 360-degree video on RTX-powered GPUs by up to 2x relative to Pascal-based GPUs. And it offers a host of new features that make it easier to stitch and stream 360-degree mono- and stereoscopic video.

VRWorks 360 Video is aimed at content creators and VR application developers, bringing a new level of performance, immersion and responsiveness to capturing, stitching and streaming 360-degree mono or stereo video, in real time or offline. Paired with Turing-based GPUs, this version introduces several new technologies, including ambisonic audio, an improved mono stitching pipeline, depth-based mono stitch, region of interest stitch, moveable seams and Warp 360.

Release highlights include:

Ambisonic Audio – a technique for recording, mixing and playing back 3D, 360-degree audio. The new pipeline enables omnidirectional audio in which the perceived direction of sound sources changes as viewers change their orientation.
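
To give a rough sense of how orientation-dependent playback can work, the sketch below rotates a first-order ambisonic (B-format) sample about the vertical axis by the listener's yaw, so that sound directions track head rotation. This is generic ambisonic math written for illustration, not the SDK's audio API; the channel layout (W, X, Y, Z with X forward, Y left, Z up) and the rotation sign are assumptions.

```cpp
#include <cmath>

// One first-order ambisonic (B-format) sample: W = omnidirectional,
// X = front/back, Y = left/right, Z = up/down. Channel convention assumed.
struct BFormatSample {
    float w, x, y, z;
};

// Rotate the sound field about the vertical axis by the listener's yaw
// (radians). W and Z are unaffected by a pure yaw rotation; X and Y mix.
// Sign conventions vary between ambisonic toolchains.
BFormatSample rotateYaw(const BFormatSample& in, float yaw) {
    const float c = std::cos(yaw);
    const float s = std::sin(yaw);
    BFormatSample out;
    out.w = in.w;                  // omnidirectional component is invariant
    out.x =  c * in.x + s * in.y;  // new "front" axis
    out.y = -s * in.x + c * in.y;  // new "left" axis
    out.z = in.z;                  // height component unchanged by yaw
    return out;
}
```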

New Mono Pipeline – a new stitching pipeline that delivers additional robustness and improved image quality with no additional processing time compared to the older method on a single GPU. It also scales across multiple Turing-based GPUs, delivering up to 2x the performance of the old mono pipeline.

Depth-based mono stitch – a new stitching pipeline that uses depth-based alignment to improve stitching quality in scenes with objects close to the camera rig, raising quality across the region of overlap between two cameras.
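
One way to picture why depth matters here: with a fixed baseline between adjacent cameras, the parallax between their views falls off with distance, so nearby objects need a much larger per-pixel correction than the background. The sketch below computes that shift under a simple rectified-stereo model; the model and parameter names are illustrative assumptions, not the SDK's internals.

```cpp
// Horizontal parallax (in pixels) between two adjacent cameras for a point
// at a given depth, using a simple rectified-stereo approximation:
//   disparity = focal_px * baseline_m / depth_m
// Nearby objects (small depth) produce large disparity, which is why overlap
// regions with close objects misalign badly without depth-aware stitching.
float disparityPixels(float focalPx, float baselineMeters, float depthMeters) {
    return focalPx * baselineMeters / depthMeters;
}

// Where camera B should be sampled so its view of a point lines up with
// camera A's pixel at column xA, given an estimated depth for that pixel.
float alignedSampleX(float xA, float focalPx, float baselineMeters, float depthMeters) {
    return xA - disparityPixels(focalPx, baselineMeters, depthMeters);
}
```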

Region of interest stitch – enables adaptive stitching by defining the desired field of view rather than stitching a complete panorama. This opens up new use cases such as 180-degree VR, and because only the requested pixels are stitched it reduces execution time and improves performance.
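
As a rough illustration of why this saves work: in an equirectangular output, pixel count scales linearly with the requested horizontal and vertical field of view, so a 180 x 180 degree region touches roughly a quarter of the pixels of a full 360 x 180 panorama. The helper below is a hypothetical sizing sketch, not an SDK call.

```cpp
#include <cmath>

struct RoiSize {
    int width;
    int height;
};

// Output resolution of an equirectangular stitch restricted to a field of
// view, given the resolution a full 360 x 180 degree panorama would use.
// Pixel count scales linearly with each FOV axis, so a 180-degree horizontal
// region of interest halves the width (and the work along that axis).
RoiSize roiOutputSize(int fullWidth, int fullHeight,
                      float hfovDegrees, float vfovDegrees) {
    RoiSize r;
    r.width  = static_cast<int>(std::lround(fullWidth  * hfovDegrees / 360.0f));
    r.height = static_cast<int>(std::lround(fullHeight * vfovDegrees / 180.0f));
    return r;
}
```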

Moveable seams – a feature that lets creators adjust seam locations within the region of overlap between two cameras to preserve visual fidelity, particularly when objects are close to the camera.
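
A minimal sketch of what moving a seam means in practice: within the overlap between two cameras, the blend is centered on a seam position that can be slid left or right, for example to keep it off a nearby subject. The feathered-blend formulation and parameter names below are assumptions for illustration, not the SDK's seam model.

```cpp
#include <algorithm>

// Blend weight for camera B at horizontal position x inside the overlap
// between cameras A and B. The seam sits at seamX (adjustable anywhere in
// the overlap); featherWidth controls how gradually A fades into B.
// Returns 0 (all camera A) left of the feather and 1 (all camera B) right of it.
float seamBlendWeight(float x, float seamX, float featherWidth) {
    float t = (x - (seamX - 0.5f * featherWidth)) / featherWidth;
    return std::clamp(t, 0.0f, 1.0f);
}

// Per-pixel composite. Sliding seamX away from a close-up object keeps that
// object entirely inside one camera's image instead of straddling the blend.
float blendPixel(float pixelA, float pixelB, float x, float seamX, float featherWidth) {
    float w = seamBlendWeight(x, seamX, featherWidth);
    return (1.0f - w) * pixelA + w * pixelB;
}
```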

Warp 360 – provides highly optimized image warping and distortion removal by converting images between a number of projection types, including perspective, fisheye and equirectangular. It can transform equirectangular stitched output into projection formats such as cubemap to reduce streaming bandwidth and improve performance.
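
To make the projection conversion concrete, here is a self-contained sketch that maps a pixel on the front face of a cubemap back to a sampling location in an equirectangular image; repeating this for all six faces and sampling the source at each result yields the cubemap. This is generic projection math for illustration, not the Warp 360 API, and the axis conventions are assumptions.

```cpp
#include <cmath>

struct EquirectCoord {
    float x;  // column in the equirectangular image
    float y;  // row in the equirectangular image
};

// For a pixel (u, v) on the front cube face, with u and v in [0, 1),
// compute where to sample a width x height equirectangular image.
// Assumed conventions: +Z forward, +X right, +Y up; longitude 0 at +Z.
EquirectCoord frontFaceToEquirect(float u, float v, int width, int height) {
    // Map the face pixel to a 3D view direction on the unit cube.
    float dx = 2.0f * u - 1.0f;   // right
    float dy = 1.0f - 2.0f * v;   // up (image rows grow downward)
    float dz = 1.0f;              // forward (front face)

    // Direction -> spherical angles (longitude and latitude).
    float lon = std::atan2(dx, dz);
    float lat = std::atan2(dy, std::sqrt(dx * dx + dz * dz));

    // Spherical angles -> equirectangular pixel coordinates.
    const float pi = 3.14159265358979f;
    EquirectCoord e;
    e.x = (lon / (2.0f * pi) + 0.5f) * width;   // lon in [-pi, pi] -> [0, width]
    e.y = (0.5f - lat / pi) * height;           // lat in [-pi/2, pi/2] -> [height, 0]
    return e;
}
```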