Spotlight
Issue: Volume 40, Issue 1 (Jan/Feb 2017)

AMD Introduces Vega Architecture

AMD recently unveiled preliminary details of Vega, its forthcoming high-performance GPU architecture. The new architecture enables possibilities in PC gaming, professional design, and machine intelligence that traditional GPU architectures have not been able to address effectively.

The Vega architecture’s memory subsystem enables GPUs to address very large datasets spread across a mix of memory types. The high-bandwidth cache controller in Vega-based GPUs can access on-package cache and off-package memories in a flexible, programmable fashion using fine-grained data movement.
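
In software terms, the arrangement resembles a page cache sitting in front of a larger, slower memory. The Python sketch below is only a loose analogy with hypothetical names, not AMD’s actual hardware design: hot pages stay in a small fast pool, and cold ones are fetched on demand at page granularity.

```python
from collections import OrderedDict

# A loose software analogy (not AMD's hardware design): hot pages live
# in a small, fast pool; cold pages are fetched on demand, at page
# granularity, from the larger and slower memory behind it.
class PageCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.fast = OrderedDict()            # page id -> data, in LRU order

    def read(self, page, slow_memory):
        if page in self.fast:                # hit: serve from the fast pool
            self.fast.move_to_end(page)
        else:                                # miss: fine-grained fetch
            if len(self.fast) >= self.capacity:
                self.fast.popitem(last=False)   # evict least-recent page
            self.fast[page] = slow_memory[page]
        return self.fast[page]

slow = {p: f"data-{p}" for p in range(100)}  # stand-in for off-package memory
cache = PageCache(capacity=4)
for p in [1, 2, 1, 3, 4, 5, 1]:
    cache.read(p, slow)
print(list(cache.fast))                      # pages still resident: [3, 4, 5, 1]
```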

At the core of the Vega architecture is a new, next-generation compute engine built on flexible compute units that can natively process 8-, 16-, 32-, or 64-bit operations in each clock cycle. These compute units are optimized to attain significantly higher frequencies than previous generations. Also, the new Vega pixel engine employs a Draw Stream Binning Rasterizer, designed to improve performance and power efficiency.
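
The multi-precision claim is easiest to picture as packed math: two 16-bit values sharing one 32-bit lane, so a single instruction can operate on both halves at once. The NumPy sketch below illustrates only that packing concept, with values chosen for the example; it is not a model of Vega’s hardware.

```python
import numpy as np

# A rough illustration of packed math (the concept, not AMD's hardware):
# two 16-bit values occupy one 32-bit lane, so one packed instruction
# applies the same operation to both halves at once.
lane = np.zeros(1, dtype=np.uint32)     # one 32-bit register lane
halves = lane.view(np.float16)          # ...seen as two float16 slots
halves[:] = [1.5, -2.25]

a = np.array([2.0, 0.5], dtype=np.float16)
b = np.array([0.25, 1.0], dtype=np.float16)
halves[:] = halves * a + b              # one "packed" multiply-add

print(halves)        # [ 3.25  -0.125]
print(hex(lane[0]))  # both results, stored in a single 32-bit word
```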

GPU products based on the Vega architecture are expected to ship in the first half of 2017.

Unity Updated to Version 5.5

Unity has released Version 5.5 of its tool set, opening up new platforms: support for Microsoft Holographic (HoloLens) is no longer in preview mode. The release also brings major improvements to the Particle System and Line Renderer components, while the Animation window gains workflow refinements and better performance for faster, more reliable iteration. Many of the new features aim to improve performance for delivering the best experience on all platforms: Unity has added GPU instancing for Android and iOS and a new CPU Usage Profiler timeline view, and has updated its physics engine to PhysX 3.3.3.

Reallusion Releases CrazyTalk Animator 3

Reallusion is offering CrazyTalk Animator 3 (CTA3), an update to its all-in-one solution that provides character design, scene building, and character motion tools for the creation of studio-level 2D animation on a PC or Mac. The software includes character templates, motion libraries, a 2D bone rig editor, face puppeteering, and audio lip-syncing tools that give users a high level of control when animating 2D talking characters for videos, the Web, games, apps, and presentations. Version 3 includes improved 2D character templates, offering a variety of character creation styles and options.

CTA3 costs $69 for the Standard version, $179 for the Pro release, and $299 for the Pipeline solution.

IKinema Rolls Out LiveAction 3.0

Real-time IK vendor IKinema has announced the third edition of LiveAction, its virtual reality and virtual production animation technology, used by 3D animation teams and directors to deliver live performances to mass audiences.

IKinema LiveAction enables users to live-stream and clean motion-capture data from the performance of an actor wearing a motion-capture suit. With the technology, there is no latency between the actor performing a motion and its retargeting and display on the virtual avatar within Unreal Engine 4.
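
At its simplest, the retargeting step amounts to mapping joints on the capture skeleton to joints on the avatar rig and carrying the solved rotations across each frame. The sketch below shows that core idea in Python, with hypothetical joint names and no claim to IKinema’s actual implementation.

```python
# Minimal sketch of retargeting; joint names are hypothetical.
BONE_MAP = {                       # what automatic bone mapping produces
    "mocap_Hips": "avatar_pelvis",
    "mocap_Spine": "avatar_spine_01",
    "mocap_Head": "avatar_head",
}

def retarget_frame(mocap_rotations):
    """Map each captured joint's rotation onto the target rig.

    mocap_rotations: dict of joint name -> quaternion (w, x, y, z).
    """
    return {
        BONE_MAP[joint]: quat
        for joint, quat in mocap_rotations.items()
        if joint in BONE_MAP       # unmapped joints are simply dropped
    }

frame = {"mocap_Hips": (1.0, 0.0, 0.0, 0.0),
         "mocap_Tail": (0.7, 0.7, 0.0, 0.0)}
print(retarget_frame(frame))       # {'avatar_pelvis': (1.0, 0.0, 0.0, 0.0)}
```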

Features new to LiveAction 3.0 include automatic bone mapping, a template editor with custom name matching, skeleton matching on the target rig, improved workflow within the animation cleaning pipeline, support for custom mocap systems through the Mocap Streaming Protocol, and faster setup.

IKinema LiveAction 3.0 is available for the Windows platform; node-locked or floating licenses start at £2,500 (nearly $3,100).

18 Scientific and Technical Achievements to Receive Academy Awards

The Academy of Motion Picture Arts and Sciences will present awards for 18 scientific and technical achievements, shared among 34 individual recipients and five organizations, at its annual Scientific and Technical Awards Presentation on Saturday, February 11.

Unlike other Academy Awards to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during 2016. Rather, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures. Ten Technical Achievement Awards (Academy Certificates) will be given, along with eight Scientific and Engineering Awards (Academy Plaques).

A few of those receiving Certificates are:

Larry Gritz for the design, implementation, and dissemination of Open Shading Language (OSL). OSL is a highly optimized runtime architecture and language for programmable shading and texturing that has become a de facto industry standard. It enables artists at all levels of technical proficiency to create physically plausible materials for efficient production rendering.

Carl Ludwig, Eugene Troubetzkoy, and Maurice van Swaaij for the pioneering development of the CGI Studio renderer at Blue Sky Studios. CGI Studio’s groundbreaking raytracing and adaptive sampling techniques, coupled with streamlined artist controls, demonstrated the feasibility of raytraced rendering for feature-film production.
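
Adaptive sampling, in the generic sense cited here, means spending samples where the image estimate is still noisy. A toy Python sketch of the idea (not CGI Studio’s code) keeps sampling a pixel until the variance of the running estimate drops below a tolerance, so smooth regions stop early and noisy ones do not.

```python
import random

# Toy adaptive sampler: stop once the variance of the mean is small.
def sample_pixel(shade, tol=1e-4, min_samples=8, max_samples=1024):
    total = total_sq = 0.0
    n = 0
    while n < max_samples:
        v = shade()                       # one radiance sample
        total += v
        total_sq += v * v
        n += 1
        if n >= min_samples:
            mean = total / n
            var = max(total_sq / n - mean * mean, 0.0)
            if var / n < tol:             # variance of the mean is small
                break
    return total / n, n

noisy_shader = lambda: 0.5 + random.uniform(-0.4, 0.4)  # stand-in shader
value, samples_used = sample_pixel(noisy_shader)
print(round(value, 3), samples_used)
```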

Parag Havaldar for the development of expression-based facial performance-capture technology at Sony Pictures Imageworks. This pioneering system enabled large-scale use of animation rig-based facial performance capture for motion pictures, combining solutions for tracking, stabilization, solving, and animator-controllable curve editing.

Nicholas Apostoloff and Geoff Wedig for the design and development of animation rig-based facial performance-capture systems at ImageMovers Digital and Digital Domain. These systems evolved through independent, then combined, efforts at two different studios, resulting in an artist-controllable, editable, scalable solution for the high-fidelity transfer of facial performances to convincing digital characters.

Kiran Bhat, Michael Koperwas, Brian Cantwell, and Paige Warner for the design and development of the ILM facial performance-capture solving system. This system enables high-fidelity facial performance transfer from actors to digital characters in large-scale productions while retaining full artistic control, and integrates stable rig-based solving and the resolution of secondary detail in a controllable pipeline.
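
A common thread in the facial-capture citations above is rig-based solving: recovering the animation-rig (blendshape) weights that best reproduce the tracked facial points. A generic toy version of that solve, with made-up data and no connection to any studio’s system, can be written as a least-squares problem:

```python
import numpy as np

# Toy rig-based facial solve: find blendshape weights w so that B @ w
# best matches the tracked point displacements d. All data is made up.
rng = np.random.default_rng(0)
n_points, n_shapes = 40, 6
B = rng.normal(size=(n_points * 3, n_shapes))   # per-shape point deltas
true_w = np.array([0.8, 0.0, 0.3, 0.0, 0.5, 0.1])
d = B @ true_w + rng.normal(scale=0.01, size=n_points * 3)  # noisy track

w, *_ = np.linalg.lstsq(B, d, rcond=None)       # least-squares solve
w = np.clip(w, 0.0, 1.0)                        # keep weights in rig range
print(np.round(w, 2))                           # close to true_w
```

In production use, such solves are regularized and constrained, and the resulting weight curves remain editable by animators, as the citations note.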

A few of those receiving Plaques are: 

Marcos Fajardo for the creative vision and original implementation of the Arnold renderer, and Chris Kulla, Alan King, Thiago Ize, and Clifford Stein for their highly optimized geometry engine and novel raytracing algorithms that unify the rendering of curves, surfaces, volumetrics, and subsurface scattering, as developed at Sony Pictures Imageworks and Solid Angle SL. Arnold’s scalable, memory-efficient, single-pass architecture for path tracing, its authors’ publication of the underlying techniques, and its broad industry acceptance were instrumental in the widespread adoption of fully raytraced rendering for motion pictures.

Vladimir Koylazov for the original concept, design, and implementation of V-Ray from Chaos Group. V-Ray’s efficient production-ready approach to raytracing and global illumination, its support for a wide variety of workflows, and its broad industry acceptance were instrumental in the widespread adoption of fully raytraced rendering for motion pictures.

Luca Fascione, JP Lewis, and Iain Matthews for the design, engineering, and development of the FACETS facial performance capture and solving system at Weta Digital. FACETS was one of the first reliable systems to demonstrate accurate facial tracking from an actor-mounted camera, combined with rig-based solving, in large-scale productions. This system enables animators to bring the nuance of the original live performances to a new level of fidelity for animated characters.

Steven Rosenbluth, Joshua Barratt, Robert Nolty, and Archie Te for the engineering and development of the Concept Overdrive motion-control system. This user-friendly hardware and software system creates and controls complex interactions of real and virtual motion in hard real time, while safely adapting to the needs of on-set filmmakers.