At the State of Unreal keynote at GDC 2023, Epic Games revealed their long-term vision for the future of content creation, distribution, and monetization — including how they are laying the foundations for a connected, open ecosystem and economy that will enable all creators and developers to benefit in the metaverse.
Unreal Editor for Fortnite now in Public Beta
Since 2018, it has been possible for Fortnite players to create their own islands in Fortnite thanks to the Fortnite Creative toolset. Today, there are more than one million of these islands, and over 40% of player time in Fortnite is spent in them.
What if creators and developers had more powerful tools and greater creative flexibility to reach Fortnite’s huge audience of more than 500 million player accounts? That becomes possible with Unreal Editor for Fortnite (UEFN), launched in Public Beta today at the State of Unreal.
UEFN is a version of Unreal Editor that can create and publish experiences directly to Fortnite. With many of Unreal Engine 5’s powerful features at their fingertips, creators and developers have a whole world of new creative options for producing games and experiences that can be enjoyed by millions of Fortnite players.
What’s more, UEFN provides an opportunity to use the new programming language Verse for the first time. Aimed at getting UEFN creators up and running with the ability to script alongside existing Fortnite Creative tools, Verse offers powerful customization capabilities such as manipulating or chaining together devices and the ability to easily create new game logic. Verse is being designed as a programming language for the metaverse, with upcoming features to enable future scalability to vast open worlds built by millions of creators for billions of players. Verse is launching in Fortnite today, and will come to all Unreal Engine users a couple of years down the road.
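To give a feel for the language, here is a minimal sketch of a Verse device script in UEFN, following the shape of Epic's public hello-world example (module paths and the `creative_device` base class are from the published UEFN documentation; treat this as an illustration rather than a complete project):

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# A custom device: placing it on an island runs its logic in the game.
hello_device := class(creative_device):

    # OnBegin runs when the game starts on the island.
    OnBegin<override>()<suspends>: void =
        Print("Hello from Verse!")
```

Scripts like this live alongside existing Creative devices, which is how Verse can manipulate or chain devices together to build new game logic.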
At the State of Unreal, Epic also showed a brand new GDC demo that tests the limits of what can be built with UEFN today. The UEFN demo showcases a variety of key UEFN features including Niagara, Verse, Sequencer, Control Rig, custom assets, existing Creative devices, and custom audio.
The demo includes three key parts: an opening section highlighting how to enhance existing Fortnite Creative devices using Verse, a deeper dive into the editor, and an exciting boss battle to close out the segment.
Epic showed the demo live running on Fortnite servers, and played it on PC.
Creator Economy 2.0
UEFN is being launched alongside Creator Economy 2.0, and in particular, engagement payouts — a new way for eligible Fortnite island creators to receive money based on engagement with their published content.
Engagement payouts proportionally distribute 40% of the net revenue from Fortnite’s Item Shop and most real-money Fortnite purchases to the creators of eligible islands and experiences, covering both islands from independent creators and Epic’s own experiences such as Battle Royale.
Dig into UEFN, Verse, and the Creator Economy 2.0 in Epic's blog post.
Fab — building a unified 3D marketplace
In the old days, every game developer built all of the content in their product from the ground up. Increasingly, content marketplaces such as Unreal Engine Marketplace and Unity's Asset Store have provided huge libraries of content which game developers can license from independent content creators and use in their games.
This enables small teams to produce games more quickly and with a lower budget, while funding the growth of independent content creators distributing their work through marketplaces.
Epic believes this trend will grow significantly as creators of experiences across Fortnite, Roblox, Minecraft, and other 3D worlds look to these marketplaces as sources of metaverse content, and as players increasingly build out and customize their own 3D spaces online.
Fab will bring together a massive community of creators, who will earn an 88% revenue share. The marketplace will host all types of digital content, including 3D models, materials, sound, VFX, digital humans, and more, and will support all engines, all metaverse-inspired games that support imported content, and the most popular digital content creation packages.
Epic hopes that as millions more come online and learn how to become 3D creators, the community can work together to make Fab accessible for all types of worldbuilders. Want a preview? Check out the Alpha version of Fab, available today as a plugin for UEFN, where it enables creators to build new games and experiences that can be published in Fortnite.
A peek at what’s coming in Unreal Engine 5.2
Unreal Engine is the backbone of the Epic ecosystem. The launch of UE5 last spring put even more creative power in the hands of developers. Since that release, 77% of users have moved over to Unreal Engine 5. Unreal Engine 5.2 offers further refinement and optimizations alongside several new features.
Electric Dreams is a real-time demonstration that showcases new features and updates coming as part of the 5.2 Preview release, available today via the Epic Games Launcher and GitHub.
In the live demo at the State of Unreal, a photorealistic Rivian R1T all-electric truck crawls off-road through a lifelike environment densely populated with trees and lush vegetation built with Quixel Megascans, and towering craggy rock structures built using Quixel MegaAssemblies.
The R1T’s distinct exterior comes to life in the demo thanks to the new Substrate shading system, which gives artists the freedom to compose and layer different shading models to achieve levels of fidelity and quality not possible before in real time. Substrate is shipping in 5.2 as Experimental.
The state-of-the-art R1T showcases the latest technology and vehicle physics, with the truck’s digi-double exhibiting precise tire deformation as it bounds over rocks and roots, and true-to-life independent air suspension that softens as it splashes through mud and puddles with realistic fluid simulation and water rendering.
In addition, the large open-world environment is built using artist-created procedural tools that build on top of a small, hand-placed environment where the adventure begins. Shipping as Experimental in 5.2, new in-editor and runtime Procedural Content Generation (PCG) tools enable artists to define rules and parameters to quickly populate expansive, highly detailed spaces that remain art-directable and work seamlessly with hand-placed content.
MetaHuman Animator: high-fidelity performance capture
However, photorealistic digital humans need high-quality animation to give truly believable performances, and the expertise and time required to create it have previously been a challenge for even the most skilled creators.
That’s all going to change this summer with the launch of MetaHuman Animator — a new feature set for the MetaHuman framework. MetaHuman Animator will enable creators to reproduce any facial performance as high-fidelity animation on MetaHuman characters.
Users can simply use an iPhone or stereo helmet-mounted cameras to capture the individuality, realism, and fidelity of the performance, transferring its detail and nuance onto any MetaHuman.
Offering a faster, simpler workflow that anyone can pick up and use — regardless of animation experience — MetaHuman Animator can produce the quality of facial animation required by AAA game developers and Hollywood filmmakers, while at the same time being accessible to indie studios and even hobbyists.
The future is bright for developers and creators
The State of Unreal featured breathtaking demos from Epic partners including Cubit, Kabam, CI Games, and NCSoft that showcased what’s possible right now with Unreal Engine 5 and other Epic tools. Awe-inspiring projects like these are proof that an exciting new era of virtual worlds, games, and experiences is already beginning to dawn.
Armed with powerful tools, a vast marketplace for digital content, unfettered access to a global audience, and a new equitable economy, the opportunities for creators are limitless.
Watch the full keynote: