Let's Dance
December 13, 2017

Across a 25-year career, CG veteran Remington Scott has worked on many projects that redefined what was possible within the realm of performance capture. He supervised the team that brought Gollum to life in the Academy Award-winning film The Lord of the Rings: The Two Towers; directed performance capture for the groundbreaking sci-fi film Final Fantasy: The Spirits Within; and delivered mocap pipelines for films such as Spider-Man 3, Superman Returns, and Beowulf.

It was Scott’s experiences on the performance-capture set of Call of Duty: Advanced Warfare, however, that opened his mind to the potential of virtual reality.

“I was just a few feet from actors like Kevin Spacey, watching them deliver these amazing performances,” Scott explains. “I got to thinking, ‘Wouldn’t it be great if people could experience this like I’m experiencing it?’”

Scott started to explore the potential of immersive narrative experiences powered by the latest in virtual reality. “I wanted to transport audiences into another world, and more importantly, into another character,” he says. “I wanted them to relate with the experience on a deeper, more emotional level. VR was the gateway to that.”

Scott teamed up with Advanced Warfare’s writer, John MacInnes, to form MacInnes Scott, a new studio dedicated to achieving this very goal. 

“Grace” was one of the new studio’s first VR experiences. Designed as both an R&D project and a proof of concept, it explores the potential of VR music videos, featuring a beautiful performance from a photoreal android dancing around a beam of light.


A Creative Spark

“Grace” was initially built as an ongoing, self-funded technology test for the MacInnes Scott team, enabling them to evaluate the latest in virtual and augmented reality, while developing something new and exciting in the process. 

Designed to be viewable on 2D screens at 4K resolution, as well as through augmented-reality devices, “Grace” is primarily built to be experienced within the immersive confines of the HTC Vive VR headset.

Set in a frigid forest environment, “Grace” begins with a glowing ball of light dancing through the trees, which the viewer can follow freely by looking in all directions. As the light source glides closer, it suddenly darts into the ground below. When the viewer looks down, there lies Grace, the titular robotic heroine, lifeless on the ground before them.

Once enveloped by the light, Grace activates, stirring to consciousness with a glitchy rhythm as her systems power up. She struggles to reach toward the hovering orb, which transforms into a vertical beam of light that she grasps before dancing around it, singing “Until We Go Down” by electro-pop artist Ruelle.

Watching this woman spring to life has proven to be a powerfully emotional experience: Scott reveals that some VR viewers of “Grace” have removed the headset with tears in their eyes.

Revealing Emotion

According to Scott, Faceware Technologies’ motion-capture tech helped bring the music video’s facial performances to life, enabling the studio to go from initial capture to immersive VR experience in very little time.

Although Grace has a robotic body, it’s her human face that establishes the emotionally resonant aspect of the experience. And it was Faceware’s hardware and software solutions that were key in capturing the subtleties of Grace’s emotions, building a sympathetic connection between digital creation and human viewer, Scott notes.

“The face needed to be the most compelling part of this emotional experience; viewers needed to read her expressions,” affirms Scott. “‘Grace’ was essentially a tech demo, but we also wanted it to be more than that. It had to be something that had a life, which you could completely relate to.”

Constructing Grace meant relying on the talents of a real-life model: Grace Holiday, a champion pole dancer, who choreographed and performed the dance routine that was motion captured. 

Pole dancing requires skin-to-metal contact at almost all times for stability and grip, so it was essential that Holiday not be overloaded with markers or gear while performing the routine. To this end, Faceware’s lightweight ProHD Headcam was able to do the job without hassle.

Using a Faceware ProHD Headcam to capture Holiday’s facial performance and reactions, Scott could record data locally while transmitting it to nearby computers, enabling the team to direct her performance and vocalizations in real time.

Meanwhile, a Vicon system captured Holiday’s body movements in a separate zone. The two systems ran simultaneously throughout the shoot to capture the entire performance.
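The article doesn’t detail how the two capture streams were brought together, but conceptually the job amounts to aligning timecoded facial frames with timecoded body frames on a single performance timeline. The Python sketch below is a simplified, hypothetical illustration of that idea; the frame structures, channel names, and timecode values are invented for the example and do not reflect Faceware’s or Vicon’s actual data formats.

```python
from bisect import bisect_left

# Hypothetical frame records: (timecode_in_seconds, data).
# In a real pipeline these would come from the facial tracker and the
# body-capture system, each stamped against a shared timecode source.
face_frames = [(0.000, {"jaw_open": 0.1}), (0.033, {"jaw_open": 0.4})]
body_frames = [(0.000, {"hips_y": 1.02}), (0.016, {"hips_y": 1.03}),
               (0.032, {"hips_y": 1.05})]

def nearest_body_frame(body, timecode):
    """Return the index of the body frame whose timecode is closest."""
    times = [t for t, _ in body]
    i = bisect_left(times, timecode)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(body)]
    return min(candidates, key=lambda j: abs(body[j][0] - timecode))

# Merge: pair every facial frame with the nearest body frame so the two
# performances share one timeline.
merged = []
for tc, face_data in face_frames:
    j = nearest_body_frame(body_frames, tc)
    merged.append({"timecode": tc, "face": face_data, "body": body_frames[j][1]})

for frame in merged:
    print(frame)
```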

At times, the headcam was removed to allow for unencumbered movement while dancing around the pole, but the MacInnes Scott crew could fill in any blanks after the fact. Otherwise, Scott says, the facial performance witnessed in “Grace” is the real mocapped performance, without blending or modification, simply transplanted onto the digital body of a dancing android.

“It’s her; it’s a pure human performance of an incredible athlete taken to a whole other level,” Scott says. “You feel like you’re in a room with somebody performing something that most people will never be able to do. That’s one of those things that just gives you energy when you’re watching it.”


Graceful Results

Once the performance was captured, the group easily transported the facial data into Faceware’s Analyzer and Retargeter, and applied Holiday’s facial expressions to the animation rig, with help from Epic’s Unreal Engine. 
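The piece doesn’t go into the mechanics of retargeting, but the core idea is mapping tracked facial values from the performer onto the controls of the character’s animation rig, often with per-control scaling so the expression reads correctly on a different face. The Python sketch below is a conceptual illustration only; the channel names, control names, and gains are hypothetical, and this is not Faceware’s Analyzer/Retargeter API or Unreal Engine code.

```python
# Hypothetical tracked values for one frame, normalized 0..1, as a facial
# tracker might produce per expression channel.
tracked_frame = {"brow_raise_l": 0.35, "brow_raise_r": 0.30,
                 "jaw_open": 0.60, "smile_l": 0.15, "smile_r": 0.20}

# Hypothetical retargeting table: performer channel -> (rig control, gain).
# The gain compensates for differences between the performer's face and the
# character rig so the expression lands with the intended intensity.
retarget_map = {
    "brow_raise_l": ("Grace_browUp_L", 1.2),
    "brow_raise_r": ("Grace_browUp_R", 1.2),
    "jaw_open":     ("Grace_jawOpen", 0.9),
    "smile_l":      ("Grace_smile_L", 1.0),
    "smile_r":      ("Grace_smile_R", 1.0),
}

def retarget(frame, mapping):
    """Map tracked channels onto rig controls, clamping to the 0..1 range."""
    rig_values = {}
    for channel, value in frame.items():
        control, gain = mapping[channel]
        rig_values[control] = max(0.0, min(1.0, value * gain))
    return rig_values

print(retarget(tracked_frame, retarget_map))
```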

“It was a one-man show,” recalls Scott. “What was extremely impressive right out of the gate were the eyes; Faceware’s capture tech nailed them 100 percent. I did no work on those eyes.

“I was really thankful because Grace is tracking this ball as it’s moving around in 3D, and her head is moving,” he continues. “To deal with that manually would have been extremely difficult.”
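To give a sense of what handling that manually would involve: animating eyes that follow a moving target is essentially an aim constraint, solved on every frame the target moves. The sketch below shows the basic math in Python; the coordinate convention and positions are assumptions for illustration, not anything from the “Grace” pipeline, which got this data automatically from the tracked performance.

```python
import math

def gaze_angles(eye_pos, target_pos):
    """Yaw/pitch (degrees) needed for an eye at eye_pos to look at target_pos.

    Coordinates are (x, y, z) with +z forward and +y up (an assumption for
    this sketch). A hand-keyed alternative to tracked eye data would have to
    evaluate something like this on every frame the target or head moves.
    """
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dz = target_pos[2] - eye_pos[2]
    yaw = math.degrees(math.atan2(dx, dz))                    # left/right
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # up/down
    return yaw, pitch

# Example: the orb drifts up and to the right of the eye.
print(gaze_angles((0.0, 1.6, 0.0), (0.4, 1.9, 1.5)))
```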

While Scott has ample experience using all sorts of performance-capture solutions, “Grace” marked his first time using Faceware. He says he was pleasantly surprised not only by its ease of use, but also by how it enabled him to keep a lean team on this quick-turnaround project. In fact, Scott had a first pass completed within the day and spent only two more days fine-tuning elements such as the lips and mouth.

Facing the Future

“Grace” made its debut at Los Angeles’ VRLA expo in 2016, the world’s largest virtual- and augmented-reality convention. HTC has since licensed “Grace” to demonstrate the Vive’s potential at pitches and events. MacInnes Scott also plans to release “Grace” online, enabling Vive users across the world to experience it, and potentially inspiring more creators to push creative boundaries within VR.

MacInnes Scott continues to innovate in the ever-expanding VR and AR worlds. At the time of writing, the studio had other VR experiences in the works, including E.K.I.A. Osama Bin Laden, and Too Soon?, a weekly parody series about Donald Trump in the White House.