The digital wall is a curved TV screen that’s 23 feet wide and 7 feet tall. “It creates this larger-than-life experience, at almost one-to-one scale, in the ‘real’ space,” says Dubrow.
Green screen, a visual effects technique commonly used in movies and television, was an option for the demo space, but the team knew they wanted something more immersive. “We didn’t want the store to look like a film studio; we wanted the experience to feel like magic,” says Dubrow.
To achieve that magic, they needed technology that could perform real-time background removal, like a green screen, while also doing something more challenging: cleanly separating the person playing the game live from the broadcast image of that same person shown on the screen behind them. That’s because their mixed-reality demo would composite the outline of the player, positioned in front of the digital wall, into the virtual content.
That called for state-of-the-art image segmentation, a computer vision technique used to understand what is in a given image at a pixel level. Among other abilities, the system would have to accurately gauge the depth of spaces and execute deep body and gesture analysis of human silhouettes. “We needed to build a program that could tell the difference between two very similar objects in the same camera field — the human and the cast of the human that it is seeing,” says Dubrow. “But we were smart, right off the bat,” she adds. “From the very beginning, we partnered or brought in people who specialized in all the areas we needed to make this work.”
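In essence, image segmentation assigns a class label to every pixel of an image. A minimal illustrative sketch with NumPy, using a toy brightness threshold in place of the trained deep-learning model a production system like this would actually run (all names here are hypothetical):

```python
import numpy as np

def segment_by_threshold(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Toy per-pixel segmentation: label each pixel 1 (foreground) or 0 (background).

    A real system would run a neural network here; a brightness threshold
    just illustrates that the output is one label per pixel.
    """
    gray = image.mean(axis=-1)              # collapse RGB to luminance
    return (gray > threshold).astype(np.uint8)

# 2x2 RGB image: one bright pixel (top-left), three dark ones
img = np.array([[[0.9, 0.9, 0.9], [0.1, 0.1, 0.1]],
                [[0.2, 0.2, 0.2], [0.0, 0.0, 0.0]]])
mask = segment_by_threshold(img)            # per-pixel label map
```
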
They joined forces with specialist studio Scout House to develop a custom segmentation program that leverages image recognition and deep learning systems to analyze the depth of the demo area and accurately identify human forms to separate them from other objects. For example, the program can distinguish between the actual human standing on the floor and the image of the human on the screen because the camera is calibrated for two modes of operation. “One is for the empty stage, so it’s looking for the differences such as the addition of the player, and the other is for isolating a certain depth range, excluding objects outside that range,” says Dubrow. The system can also infer the outline of people’s legs and remove the space in between the limbs, rather than capturing the whole silhouette.
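The two modes Dubrow describes can be combined per pixel: a background-subtraction mask (anything that differs from the empty-stage reference) intersected with a depth gate (anything within the player's depth range, which excludes the on-screen image of the player farther away). A hedged sketch, assuming RGB frames and a per-pixel depth map as NumPy arrays; all function names and thresholds are illustrative, not the actual Scout House implementation:

```python
import numpy as np

def isolate_player(frame, empty_stage, depth, near=0.5, far=3.0, diff_thresh=0.1):
    """Return a boolean mask of pixels belonging to the live player.

    1. Background subtraction: keep pixels that differ from the
       empty-stage reference frame.
    2. Depth gating: keep only pixels inside [near, far] meters,
       excluding the screen (and the player's image on it) behind
       the demo area.
    """
    changed = np.abs(frame - empty_stage).mean(axis=-1) > diff_thresh
    in_range = (depth >= near) & (depth <= far)
    return changed & in_range

# Toy 2x2 scene: top-left pixel is the player (close), top-right is the
# player's image on the screen (far), bottom row is unchanged background.
empty = np.zeros((2, 2, 3))
frame = np.array([[[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]],
                  [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]])
depth = np.array([[1.0, 5.0],
                  [1.0, 1.0]])
player_mask = isolate_player(frame, empty, depth)
```

Only the close, changed pixel survives both tests, which is the behavior the two calibration modes are meant to produce.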
“The program also does all of this processing very quickly, compositing the human into the in-game content in just a fraction of a second,” says Dubrow. “That gives us very sharp segmentation with a low lag time,” she adds.
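The compositing step itself is conceptually simple once the segmentation mask exists: wherever the mask marks the player, show the camera feed; everywhere else, show the game content. A minimal alpha-compositing sketch in NumPy (buffer formats and names are assumptions, not Meta's actual pipeline):

```python
import numpy as np

def composite(player_rgb, mask, content_rgb):
    """Overlay the segmented player onto the game content.

    `mask` is the per-pixel segmentation (1 = player, 0 = background);
    where it is 0, the game content shows through.
    """
    alpha = mask[..., None].astype(player_rgb.dtype)   # broadcast mask over RGB
    return alpha * player_rgb + (1 - alpha) * content_rgb

# 1x2 frame: left pixel is the player, right pixel is background
player = np.full((1, 2, 3), 0.8)
content = np.full((1, 2, 3), 0.2)
mask = np.array([[1, 0]])
out = composite(player, mask, content)
```

The low lag Dubrow mentions comes from keeping this whole path (segmentation plus blend) on per-frame array operations rather than anything slower.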
Moving pieces
The segmentation of the real-time camera feed was just one piece of the broadcasting puzzle. “There’s a lot happening all at the same time during a cast, and so many individual pieces that all have to come together,” says Dubrow. She reels off a list: The headset’s IP address connects to the mixed-reality capture (MRC) server via a purpose-built demo associate (DA) iOS app, the Open Broadcaster Software composites the gameplay and real-time feeds, and a media processor layers the composited feed over an HTML page. Simultaneously, that HTML page is pulling the game title information from the session data and is ready to animate in sequence behind the Demo Lead-In countdown video that reveals the MRC. And this all happens in just three seconds, says Dubrow.
Getting all these pieces moving in unison and developing the systems that needed to interface with one another required collaboration with a few highly specialized companies. “If you were to look at kind of a flat map chart of everyone involved, there were at least five or six other groups we worked with beyond our own team,” says Fruy. “This was a first-of-its-kind demo experience, so it required several pieces of custom software, apps, and integrations.” In addition to Scout House, the team partnered with ForwardXP to develop the DA app that controls the headset as well as the headset app launcher, and with Systems Innovation to develop the demo’s broadcasting tools.
These systems also had to integrate with both the team’s own custom software and third-party systems to enable players to schedule an appointment, set their preferences, and receive a clip of their demo afterward. For example, a feature called Meta Store Connect allows guests to book their demo remotely. They can also link their Facebook account to the demo if they want to share their demo clip with friends and family.