
The rise of the virtual singer: the making of Fox's Alter Ego

Alter Ego is a singing competition like no other. At first, Fox’s new 11-episode primetime show might feel familiar. There are the singers, who are competing to be recognized as professional stars; the famous judges including Grimes, will.i.am, Nick Lachey, and Alanis Morissette; and even the live audience of 200 fans. There’s just one very important difference: every single competitor performing live on Alter Ego’s stage is an avatar.

And the avatars aren’t just giving singers a new persona to perform with. They’re also reinventing what a singing competition can be, turning what could have been just another contest into a striking experience that upends the stereotypes of pop stardom, from the premium placed on looks to the age of a performer, in favor of a digital persona and the unique voice behind it.

It’s a brave new way to change a familiar format. But doing something so groundbreaking doesn’t come without its challenges.

“I didn’t think it was going to be possible to do,” begins Michael Zinman, Executive Producer at Lulu AR and Co-Executive Producer of Alter Ego. “Not because you couldn’t have an avatar on stage singing in a motion body capture suit. But how do you do 10 of these performances in a day, keep it on budget, and keep it transparent and as elementary as you can for the network and for the producers on stage—no different from a show with live-action people.”

Creating the next generation of pop stars

If Zinman was going to keep attention on the avatars, he needed the mocap locked in and offstage. For that, he brought in Silver Spoon Animation, who would handle not only the capture process but also the characters themselves. Twenty characters were needed in total, each of which had to connect emotionally with an audience and hold up under live judging, all in a matter of six weeks.

A team of 50 artists started by working with each contestant to design a character that accurately reflected their unique stage persona, complete with four different wardrobe options and individual visual effects.
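The article doesn’t say how those per-character designs were organized inside the engine, but a small data asset along the following lines gives a sense of the pieces each avatar carried. The class name, the property names, and the assumption that the effects are Niagara systems are illustrative guesses rather than Silver Spoon’s actual setup.

// AvatarConfig.h -- illustrative sketch only: one possible way to bundle a
// contestant's avatar design (four wardrobe options, signature effects, and
// baseline appearance values) as an Unreal data asset. All names are assumed.
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataAsset.h"
#include "AvatarConfig.generated.h"

class USkeletalMesh;
class UNiagaraSystem; // assumes the Niagara plugin is enabled

UCLASS(BlueprintType)
class UAvatarConfig : public UPrimaryDataAsset
{
	GENERATED_BODY()

public:
	// The contestant's stage persona this design reflects.
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Identity")
	FText PersonaName;

	// Four wardrobe variants per avatar, per the show's brief.
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Look")
	TArray<TSoftObjectPtr<USkeletalMesh>> WardrobeOptions;

	// Visual effects unique to this avatar.
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Look")
	TArray<TSoftObjectPtr<UNiagaraSystem>> SignatureEffects;

	// Baseline appearance values; on the show, DMX could also drive these live.
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Look")
	FLinearColor EyeColor = FLinearColor::White;

	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Look")
	FLinearColor HairColor = FLinearColor::Black;
};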

Every performance on the show would then be captured live using 14 cameras, eight of which were fitted with stYpe tracking technology. Avatar data (eye color, height, and special effects), along with motion-capture data, lighting data, and camera data, was then sent to a hub of Unreal Engine instances behind the main stage.
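The article doesn’t publish the wire format those streams used, but the rough sketch below shows the kind of per-frame payload such a pipeline has to carry to each engine instance. Every struct and field name here is an assumption for illustration, not the show’s actual protocol; camera tracking came from stYpe and mocap from Vicon, each with its own native format.

// StageFrame.h -- illustrative only: a rough shape for the per-frame data an
// on-set bridge might forward to the render hub. Field names are assumptions.
#pragma once

#include "CoreMinimal.h"

struct FCameraTrackingFrame      // from one of the eight stYpe-tracked cameras
{
	int32 CameraId = 0;
	FVector Position = FVector::ZeroVector;   // stage space
	FQuat Rotation = FQuat::Identity;
	float FocalLengthMm = 35.f;
	float FocusDistanceM = 5.f;
};

struct FMocapFrame               // from a contestant's capture suit
{
	double TimestampSeconds = 0.0;
	TArray<FTransform> BoneTransforms;        // one transform per skeleton bone
};

struct FAvatarState              // DMX-style appearance attributes
{
	FColor EyeColor = FColor::White;          // 0-255 per channel, like a DMX fader
	FColor HairColor = FColor::Black;
	float HeightScale = 1.f;                  // relative to the design-time height
	uint8 ActiveEffectId = 0;                 // which signature effect is live
};

struct FStageFrame               // everything one engine instance needs per frame
{
	FMocapFrame Mocap;
	TArray<FCameraTrackingFrame> Cameras;
	FAvatarState Avatar;
	TArray<uint8> DMXLightingUniverse;        // one raw 512-channel DMX universe
};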

The result was a virtual avatar that always accurately reflected the motion-captured performance happening behind the scenes. When a contestant cried, the avatar would cry too. And blush. And run their fingers through their hair. The final performances could then be watched on on-set monitors, making it easy for the live audience to engage with the characters and become immersed in their journeys, rather than feel removed from the story by the computer-generated imagery.

“DMX was so important for this show because it gave the lighting programmer, director, and production an unbelievable amount of control over the characters,” adds Dan Pack, Managing Director of Silver Spoon Animation. “The great thing about Unreal Engine vs. traditional rendering pipelines is that we can preview incredible changes that would normally take a lot of time in post to render…so we are able to change character hair color, eye color, texture, and control the effects, all through this DMX control panel. We pushed DMX on this show further than it has ever gone before in a real-time setting.”
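As a rough illustration of the idea Pack describes (a console fader changing an avatar’s look live), the sketch below maps a few DMX channel values onto a character material. GetDMXChannelValue() is a hypothetical stand-in for Unreal Engine’s DMX plugin, which exposes patched fixtures and their attribute values, and the universe, channel numbers, and material parameter names are assumptions about how such an avatar material might be authored.

// ApplyDMXToAvatar.cpp -- minimal sketch: values coming off a DMX console
// drive a character's look in real time. GetDMXChannelValue() is a
// hypothetical stand-in for Unreal's DMX plugin; the universe/channel
// numbers and material parameter names are assumptions.
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Hypothetical: returns the latest 0-255 value for a channel in a DMX universe.
extern uint8 GetDMXChannelValue(int32 Universe, int32 Channel);

void ApplyDMXToAvatar(USkeletalMeshComponent* AvatarMesh)
{
	if (!AvatarMesh)
	{
		return;
	}

	// Create (or reuse) a dynamic material instance so parameters can change live.
	UMaterialInstanceDynamic* Mid =
		Cast<UMaterialInstanceDynamic>(AvatarMesh->GetMaterial(0));
	if (!Mid)
	{
		Mid = AvatarMesh->CreateAndSetMaterialInstanceDynamic(0);
	}
	if (!Mid)
	{
		return;
	}

	// Three DMX channels become an RGB hair color, exactly like a lighting fader.
	const FLinearColor HairColor(
		GetDMXChannelValue(/*Universe*/ 1, /*Channel*/ 1) / 255.f,
		GetDMXChannelValue(1, 2) / 255.f,
		GetDMXChannelValue(1, 3) / 255.f);
	Mid->SetVectorParameterValue(TEXT("HairColor"), HairColor);

	// A fourth channel gates one of the avatar's signature effects on or off.
	const bool bEffectOn = GetDMXChannelValue(1, 4) > 127;
	Mid->SetScalarParameterValue(TEXT("EffectIntensity"), bEffectOn ? 1.f : 0.f);
}

Because the characters answer to the same protocol as the rig, the lighting console Zinman mentions below can patch character channels alongside studio-light channels and cue them together during rehearsal.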

“In order to pull off a show like Alter Ego, you need Unreal Engine’s DMX capabilities; this is the vocabulary and protocol we all speak and the only way to bring in a network competition show on schedule,” says Zinman. “When lighting directors can program virtual and real studio lights on one console, and the characters can be programmed on another, you run through rehearsals much faster. This is how we were able to do 10 unique performances each day.”

In tandem, Silver Spoon tapped Unreal Engine’s Live Link capabilities to stream data from more than ten mocap instances onto the avatars. With virtually no latency, Unreal Engine renders the high-quality animation captured from the performers onto the avatars in real time, as it happens. Live Link was also used for facial animation via the dedicated Live Link Face app, which captures facial performances on an iPhone or iPad and renders them in real time in the engine. Using Live Link removed the need for Alter Ego to write a dedicated plugin for the Vicon data packets, giving the team a free-flowing way to animate their skeletal meshes.
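For a sense of what consuming a Live Link subject looks like in code, here is a minimal sketch against Unreal Engine’s ILiveLinkClient interface. The subject name is an assumption; in practice, the mocap source publishes its own subject names.

// QueryLiveLinkSubject.cpp -- minimal sketch of reading one streamed mocap
// subject through Live Link. The subject name "Contestant_01" is assumed.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkAnimationRole.h"
#include "Roles/LiveLinkAnimationTypes.h"

void LogLatestMocapFrame()
{
	IModularFeatures& Features = IModularFeatures::Get();
	if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
	{
		return; // Live Link is not loaded
	}

	ILiveLinkClient& Client =
		Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

	// Ask for the latest animation frame for one streamed subject.
	FLiveLinkSubjectFrameData FrameData;
	const bool bHasFrame = Client.EvaluateFrame_AnyThread(
		FLiveLinkSubjectName(TEXT("Contestant_01")),    // assumed subject name
		ULiveLinkAnimationRole::StaticClass(),
		FrameData);

	if (!bHasFrame)
	{
		return;
	}

	// Bone transforms for this frame; in a setup like the show's, these would be
	// applied to the avatar's skeletal mesh with effectively no added latency.
	if (const FLiveLinkAnimationFrameData* Anim =
			FrameData.FrameData.Cast<FLiveLinkAnimationFrameData>())
	{
		UE_LOG(LogTemp, Log, TEXT("Received %d bone transforms"), Anim->Transforms.Num());
	}
}

Inside an Animation Blueprint, the built-in Live Link Pose node performs the equivalent lookup; the code above is simply the lower-level version of that step.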

A more inclusive future

By the time production was complete, Silver Spoon and Lulu AR had helped create more than 80 performances, or 12 hours’ worth of content, all with no post-production required.

“The pressure on this show was huge. Nothing like it has been done at this scale and in a prime time capacity ever before,” Pack explains. “Unreal Engine is at the point where we can do real-time particle simulation, real-time hair simulation, and real-time lighting directly within the engine. We can pull off these really incredible effects, effects that normally would have to be done in post-processing, in real time.”

The result? For artists who may have been too scared to go on stage, or who otherwise felt unworthy of being on TV, it’s a chance to finally succeed, with virtual avatars powering a new form of self-expression. For audiences, it’s a new path forward, where physical rules give way to boundless creativity. If Alter Ego proves anything, it’s that virtual avatars are ready for primetime. So what’s stopping the next season, or the next show, from going even bigger?

Stay tuned!
