Our understanding of history mostly comes from artifacts, but what if we could see the expressions on the face of an ancient human? What if we could actually see them move their facial muscles to reflect surprise, or anger, or fear — or even to smile at you? Those were the questions posed by Sofija Stefanović, Professor of Physical Anthropology in the Department of Archeology at the University of Belgrade, and her team when they set about creating a trailblazing reconstruction of the face of a 10,000-year-old shaman from the famous Lepenski Vir archeological site in Serbia.
Also conveniently located in Serbia is 3Lateral — a leading developer of digital human technology, and now part of Epic Games — which partnered with Stefanović on the project. The team at 3Lateral has been working alongside colleagues from Cubic Motion — another member of the Epic Games family — and the Unreal Engine team to develop the MetaHuman framework, whose free, cloud-based app MetaHuman Creator opened in Early Access last year. MetaHuman Creator enables anyone to create unique, high-fidelity digital humans, derived from its database of scanned data, in minutes.
The latest MetaHuman release, however, not only brought new features to MetaHuman Creator, but also launched a new MetaHuman Plugin for Unreal Engine. The first major feature of that plugin is Mesh to MetaHuman, which shares the core technology used to reanimate the shaman. This exciting development, now freely accessible to everyone, enables you to create a MetaHuman from your own custom model and complete similar projects without expert knowledge.
Physical facial reconstruction using traditional techniques
The Lepenski Vir settlement in Eastern Serbia, on the banks of the Danube River, was first discovered in the 1960s. It is a site of huge historical significance as it showcases the early stages of the development of European prehistoric culture. Around 500 skeletal remains were unearthed at the site, including the skeleton that has come to be known as the ‘shaman’ thanks to the unusual cross-legged lotus position that it was found sitting in. The shaman is thought to have lived around 8,000 BC. From the skeleton, the team of experts was able to determine his height, weight, and even the fact that he lived on a largely seafood-based diet.
To begin the reconstruction, the team needed to create an exact physical replica of the prehistoric skull in order to preserve the original’s integrity. Under the guidance of archaeologist Jugoslav Pendić from the BioSense Institute in Novi Sad, Serbia, the team began recreating the shaman’s prehistoric facial features by capturing hundreds of 2D images with a full-frame camera. These were imported into RealityCapture to form a 3D virtual model, which was then 3D printed to produce a physical model.
The replica skull was then passed to Oscar Nilsson, a forensic artist and archaeologist from Sweden, who is an expert in reconstructing models of ancient faces for museums around the world. Nilsson was able to begin creating a forensic facial reconstruction, adding muscle and skin layers with clay, with the thickness determined by the gender, age, ethnicity, and estimated weight of the subject.
Turning a static mesh into an animatable model
To bring the shaman model to life, the clay reconstruction — still without skin textures and hair — was scanned with a Peel scanner and reconstructed in RealityCapture to create a digital model. With a basic texture for the skin and eyes applied (a prerequisite for the next step), it was time to put the mesh through the Mesh to MetaHuman process.
Using automated landmark tracking in Unreal Engine 5, Mesh to MetaHuman fit the MetaHuman topology template to the shaman scan. This new mesh was then submitted to the cloud, where it was matched to a MetaHuman with similar facial geometry and proportions to the shaman, and automatically bound to the MetaHuman facial rig. The technology used MetaHuman Creator’s extensive database of scans of real human expressions to produce ‘statistically estimated expressions’ for this ancient face, while preserving the deltas to retain the original likeness. To find out more about how the process works, check out this tutorial.
Once the process was complete, the character could be opened in MetaHuman Creator. Immediately, the team could press the play button and see the shaman come to life using any of the several preset animations the application offers.
Refining the picture with MetaHuman Creator
Next, in a collaborative session with archaeologists, forensic experts, and a MetaHuman character artist, the team narrowed down the appearance of the shaman’s facial hair and skin properties inside MetaHuman Creator. Analysis of available DNA suggested that he would have had ‘intermediary-dark’ skin, dark hair, and brown eyes. “We can estimate the color of the skin, the hair, and eyes with more than 90 percent accuracy,” says Stefanović.
Wrinkles and gray hairs were added based on his estimated age; the teeth were adjusted to match the skull; and the style of his hair and beard was based on the tools (such as shells) plausibly available around 8,000 BC.
With the shaman’s digital appearance complete, Nilsson could then apply the same styling to the physical model to complete the traditional 3D reconstruction.
“All that back and forth in MetaHuman Creator was really brilliant, because normally I would do this work myself by hand, and that is very expensive and takes a lot of time,” says Nilsson. “But to be able to do this digitally is really a game changer.”
A real-time interaction with history
The team then brought the digital model into Unreal Engine, where they created a virtual environment with realistic lighting. The shaman could then be animated by hand using the MetaHuman facial rig, or using the free Live Link Face app that lets you use an iPhone or iPad to capture facial animation and stream it onto a character in Unreal Engine in real time.
The virtual and physical models were developed in tandem and finally unveiled at Expo 2020 Dubai by the national platform Serbia Creates, which supported the project’s realization. At the showcase, visitors could capture their own expressions via iPhone and see them mirrored on screen, in real time, by the digital shaman.
“When nobody's interacting with him, we let him yawn and stretch his face by borrowing some of the animation from his fellow MetaHumans,” says Adam Kovač, Solutions Specialist at 3Lateral. “It was pure joy to see him smile.”
Stefanović agrees. “Never before have we had the opportunity to see what ancient people look like when they are showing their emotions,” she says.
The interactive setup is now exhibited at the National Museum of Serbia in Belgrade, alongside the institution’s collection of Lepenski Vir artifacts.
While the MetaHuman framework has received considerable interest within the gaming and film industries, the Lepenski Vir shaman offers a glimpse of how the technology can support innovative projects in more traditional fields like archaeology and forensics. The team hopes that the groundbreaking reconstruction will fuel the imagination of other pioneers to experiment in different spaces. “We see teams exploring uses with medicine, automotive, and even psychology research,” says Kovač. “So, it's really up to the users where this goes next.”
Watch Unreal Engine's video or visit their blog post to learn more.