CARY, NC — Epic Games (epicgames.com) has released Unreal Engine 4.23, a new version featuring next-generation virtual production tools that will allow users to achieve final pixel quality in realtime, as well as developer-focused tools to maximize performance.
Filmmakers can now achieve final shots live on set. With LED walls powered by nDisplay, filmmakers can bring real-world actors and props into a photoreal Unreal Engine environment background, capturing interactive and accurate lighting and reflections in-camera. They can also switch to a digital green screen for realtime compositing in UE4. Additional virtual production capabilities include the ability to interactively and collaboratively explore digital UE4 environments with new VR scouting tools, leverage enhanced Live Link tools for realtime data streaming, and remotely control UE4 from an iPad or other device for increased on-set flexibility.
First introduced in UE 4.22, ray tracing has received numerous enhancements to improve stability and performance, and to support additional material and geometry types, including landscape geometry, instanced static meshes, procedural meshes, and Niagara sprite particles. These improvements deliver a better out-of-the-box experience and higher-quality end results for users.
With Chaos, Unreal Engine's new physics and destruction system, artists can fracture, shatter, and demolish massive-scale scenes at cinematic quality with a high degree of artistic control. Chaos is also integrated with the Niagara VFX system to trigger the generation of secondary effects, such as dust and smoke.
Unreal Engine 4.23 introduces both Streaming and Runtime Virtual Texturing, in which large textures are tiled and only the visible tiles are loaded: Streaming Virtual Texturing reduces texture memory overhead for light maps and detailed artist-created textures, while Runtime Virtual Texturing improves rendering performance for procedural and layered materials.
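The tiling idea is straightforward to sketch in isolation. The snippet below is a conceptual illustration only, not Unreal Engine API: the names TileId, VirtualTextureCache, and LoadOrGenerateTile are invented for this example. It shows the mechanism described above: divide the texture into tiles, keep only the tiles that visible pixels actually sampled resident in memory, and stream in or generate the rest on demand.

```cpp
#include <cstdint>
#include <unordered_map>
#include <unordered_set>
#include <vector>

// A tile address within the virtual texture: page coordinates plus mip level.
struct TileId { uint32_t X = 0, Y = 0, Mip = 0; };

inline bool operator==(const TileId& A, const TileId& B)
{
    return A.X == B.X && A.Y == B.Y && A.Mip == B.Mip;
}

struct TileIdHash {
    size_t operator()(const TileId& T) const
    {
        return (size_t)T.X * 73856093u ^ (size_t)T.Y * 19349663u ^ (size_t)T.Mip * 83492791u;
    }
};

// Texel payload for one resident tile.
struct TileData { std::vector<uint8_t> Texels; };

class VirtualTextureCache {
public:
    // Called once per frame with the tiles that visible pixels sampled;
    // tiles outside this set are evicted, newly visible ones are brought in.
    void Update(const std::unordered_set<TileId, TileIdHash>& VisibleTiles)
    {
        for (auto It = Resident.begin(); It != Resident.end();)
        {
            if (VisibleTiles.count(It->first) == 0)
            {
                It = Resident.erase(It);   // no longer sampled: free the memory
            }
            else
            {
                ++It;
            }
        }
        for (const TileId& Tile : VisibleTiles)
        {
            if (!Resident.count(Tile))
            {
                Resident.emplace(Tile, LoadOrGenerateTile(Tile));
            }
        }
    }

private:
    // Placeholder: a streaming virtual texture would read cooked tile data
    // from disk here, while a runtime virtual texture would instead render
    // the tile from material output.
    TileData LoadOrGenerateTile(const TileId&) { return TileData{}; }

    std::unordered_map<TileId, TileData, TileIdHash> Resident;
};
```

In the engine itself, the set of sampled tiles is gathered by a GPU feedback pass; Streaming Virtual Textures fill tiles from cooked texture data on disk, while Runtime Virtual Textures generate them on the GPU from material output.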
The new Unreal Insights system collects, analyzes, and visualizes data on UE4 behavior for profiling, helping users understand engine performance from either live or pre-recorded sessions. In addition to the default sub-systems and events that are tracked automatically, users can add their own code annotations to generate trace events.
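As a minimal sketch of such an annotation, the fragment below assumes the CPU profiler trace macros from the engine's ProfilingDebugging headers; the function RebuildPathfindingGraph and its workload are placeholders invented for illustration.

```cpp
#include "ProfilingDebugging/CpuProfilerTrace.h"

// Hypothetical game-module function; the name and the work it does are placeholders.
void RebuildPathfindingGraph()
{
    // Emits a named scope that Unreal Insights records whenever a trace
    // session is active (for example, when the game is launched with -trace=cpu).
    TRACE_CPUPROFILER_EVENT_SCOPE(RebuildPathfindingGraph);

    // ... expensive work to be measured ...
}
```

Opening the resulting session in Unreal Insights then shows the custom scope in the Timing view alongside the engine's default timers, either live or from the saved trace file.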
Support for the Microsoft HoloLens 2, initially released in beta in May, is now production-ready. Features include streaming and native deployment, emulator support, finger tracking, gesture recognition, meshing, voice input and spatial anchor pinning.