Reality Labs

Three-part harmony: Spark’s new Multi-Class Segmentation tech enables more realistic and complex AR effects

May 27, 2021

Today we’re launching Multi-Class Segmentation in Spark AR Studio, which is a very dry and technical name for a very cool feature. With Multi-Class Segmentation, or MCS, creators will finally be able to layer some of your favorite AR capabilities — changing your background, your hair, and your clothes — all at once. The more options, the better, right?

It’s one of our most sophisticated AR updates to date, and we’re excited to see what you do with it. Keep reading for more details on the underlying MCS tech, how it supercharges creativity on Spark and Instagram, and more.

The Tech

We rolled out models for person segmentation in 2019 and followed that up with hair segmentation last year. You’ve probably never used the term “segmentation” yourself, but it’s the technical term for what the AI model is doing — recognizing that a group of pixels is a “person” (or another class) based on extensive training, and then segmenting (or separating) that part from the whole. From there, creators can apply colors, patterns, and textures to that segment, or remove it altogether to create a green screen effect.
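
For the curious, here’s what that looks like in miniature. This isn’t Spark’s actual code, just a tiny NumPy sketch of how a single “person” mask becomes a green screen effect:

```python
# A minimal sketch of a one-class "person" segmentation effect, using plain NumPy.
# The random arrays stand in for a real camera frame and a real model's output.
import numpy as np

H, W = 256, 256
frame = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)   # stand-in camera frame
new_background = np.zeros((H, W, 3), dtype=np.uint8)           # your replacement backdrop
new_background[..., 1] = 255                                   # plain green here

# A segmentation model outputs, for every pixel, how likely it is to be "person".
person_prob = np.random.rand(H, W).astype(np.float32)          # stand-in model output
person_mask = (person_prob > 0.5)[..., None]                   # boolean mask, shape (H, W, 1)

# The "green screen" effect: keep the person's pixels, swap out everything else.
composite = np.where(person_mask, frame, new_background)
```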

If you’ve played around with any effects on Instagram, you’ve probably brushed up against these models, even if you didn’t realize they were powered by AI.

Demonstration of the three MCS segmentation classes — person, skin, and hair.

MCS is a major leap forward, though. While our previous AR capabilities could recognize and layer effects on a single concept (e.g., “person” or “hair”), MCS can recognize and augment three at once — person, hair, and skin.

It may sound easy, like we simply mashed up three models into one, but the reality is much more complex. Reliably identifying a single segment is hard enough — for instance, making sure that a ponytail draped over a shoulder is seen as “hair” while the tank top strap on the other side is not. MCS does that three times over, delineating where hair, skin, and clothing overlap so that creators can design effects that make your skin glow like a lightbulb or turn green without bleeding onto a hat or shirt.
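
We can’t paste the real model here, but for the technically curious, one common way to build this kind of multi-class segmentation is a single shared network with one mask output per class. The toy PyTorch sketch below is an illustration of that idea, not our production architecture:

```python
# Illustrative only -- not the MCS architecture. A shared backbone feeds one output
# channel per class, and a sigmoid gives each pixel an independent probability of
# being "person", "hair", or "skin", so the masks can overlap where they should.
import torch
import torch.nn as nn

CLASSES = ["person", "hair", "skin"]

class ToyMultiClassSegmenter(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.backbone = nn.Sequential(              # stand-in for a small mobile CNN
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        self.heads = nn.Conv2d(16, num_classes, 1)  # one mask channel per class

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.heads(self.backbone(frame)))

model = ToyMultiClassSegmenter()
frame = torch.rand(1, 3, 256, 256)                  # normalized RGB camera frame
probs = model(frame)                                # shape: (1, 3, 256, 256)
masks = {name: probs[0, i] > 0.5 for i, name in enumerate(CLASSES)}
# masks["person"], masks["hair"], and masks["skin"] can each drive a different effect.
```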

The end result? MCS creates more realistic-looking effects — and it does so (quite literally) faster than the blink of an eye. Eight milliseconds, give or take. We also made sure that MCS runs on a wide range of mobile devices, not just the latest and greatest. And by building the model using PyTorch we kept the size down, so it wouldn’t take up tons of storage space.
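
We won’t walk through our whole deployment pipeline, but if you’ve never shipped a PyTorch model to a phone, the standard recipe looks roughly like this: trace the model, optimize it for mobile, and save a compact artifact for PyTorch’s lite interpreter. The toy network and numbers below are placeholders, not MCS itself:

```python
# Not our deployment pipeline -- just the standard PyTorch Mobile recipe for getting
# a small segmentation network onto a phone: trace it, optimize it for mobile, and
# save a compact artifact for the lite interpreter. The network here is a placeholder.
import time
import torch
import torch.nn as nn
from torch.utils.mobile_optimizer import optimize_for_mobile

toy_net = nn.Sequential(                             # three mask channels, as in the sketch above
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 1), nn.Sigmoid(),
).eval()
example = torch.rand(1, 3, 256, 256)

with torch.no_grad():
    traced = torch.jit.trace(toy_net, example)
optimized = optimize_for_mobile(traced)
optimized._save_for_lite_interpreter("toy_mcs.ptl")  # the compact on-disk artifact

# A crude latency check on the host; real profiling happens on the phone itself.
with torch.no_grad():
    start = time.perf_counter()
    for _ in range(100):
        toy_net(example)
print(f"avg forward pass: {(time.perf_counter() - start) / 100 * 1000:.1f} ms")
```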

Sample effect from @enuriru

MCS isn’t perfect. Beards are a sore spot, for instance. You’d call them hair, right? But our model will sometimes get confused and lump them in with skin. Sorry to all you lumberjacks and grizzled sea captains. We’re working on it. We partnered with our Inclusive AI team from the start to make sure we built MCS for a variety of people across different demographics, and we worked with them to create a roadmap for future improvements. We’ll keep refining MCS over time.

Keep an eye out — we’ll have more to say about this ongoing work soon.

Put on a Show

So what can you do with MCS? To be honest, we’ve only just started playing around with it, and we’re certain you’ll come up with ideas we haven’t even thought of yet.

MCS has tons of potential though, not all of it immediately obvious. For instance: Use “Person” segmentation to isolate your body from the background, and combine that with “Hair” and “Skin” segmentation to isolate everything that’s not-clothing — and voila, you can now modify just the color or fit of your clothes. Try on a new shirt without going to the store, or change both your clothes and hair to become a new character on-the-fly in your own one-person theater production.
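
In mask arithmetic, that trick is just a few boolean operations. Here’s a minimal sketch, assuming you already have the three masks (random stand-ins below):

```python
# A minimal sketch of the "clothes only" trick, assuming you already have boolean
# person, hair, and skin masks (random stand-ins here).
import numpy as np

H, W = 256, 256
person = np.random.rand(H, W) > 0.3   # stand-ins for real segmentation masks
hair   = np.random.rand(H, W) > 0.8
skin   = np.random.rand(H, W) > 0.7

# Whatever is part of the person but neither hair nor skin is (roughly) clothing.
clothes = person & ~hair & ~skin

frame = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)   # stand-in camera frame
tinted = frame.copy()
tinted[clothes] = (0.5 * frame[clothes] + 0.5 * np.array([0, 0, 255])).astype(np.uint8)
# `tinted` is the same frame with only the clothing recolored.
```

Swap in real masks, and those few lines of boolean logic give you a clothes-only layer you can recolor, retexture, or hide entirely.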

...Or fulfill your superhero dreams and become invisible.

Sample effect from @enuriru

A lot of simple effects are still possible with MCS; you could use it just to change your hair color or background, if you’d like. But the real benefit is the depth of creativity that opens up when you start mixing and matching — adding different effects to your hair, your background, your clothes, and your skin at the same time. Show off your hair’s hidden sparkle, and let your arms, legs, face, neck, and even your elbows fire off laser beams like some sort of visual symphony. (We’re looking at you, Reels dancers!)

We’ve noticed a lot of you brushing up against the limits of our current capabilities, as your posts have gotten more and more elaborate and ambitious. Hopefully MCS will help empower those who want deeper and more robust creative tools. 

Ethics and AR

While we think MCS will be an amazing tool for creators, we also know that we must balance expression with the safety of our community — particularly when it comes to skin segmentation. 

Skin segmentation enables creators to fine-tune their makeup, turn invisible, look like a zombie, or even achieve effects that have nothing to do with skin like the clothes and costume changes mentioned above. It’s very versatile, and there are all sorts of benign uses for MCS that we’re excited to see the community explore.

But skin and skin color are sensitive subjects. We use a combination of human and automated systems to review effects as they’re submitted for publishing, and we’ll be doing what we can to prevent any use of MCS that violates our Community Standards. We want to ensure that Spark AR and Instagram are positive and welcoming places, so if you think an effect or Story violates those guidelines, report it. All reported effects will be reviewed manually and (if warranted) removed.

With MCS and all future AR capabilities, we’ll continue to keep an open dialogue with a range of experts and with the community to ensure we’re meeting our own standards and keeping Facebook company technologies safe and welcoming for all.

More to Come

As mentioned, stay tuned for future updates as we continue refining MCS and making it more reliable for more people in more situations. We’re excited to get MCS in your hands today though, and can’t wait to see what you make with it!