AR Studio lets developers create effects that combine computer vision algorithms, sensor data, and their own data into experiences that come to life. Here are a few key features of the platform, which is in closed beta today:
- Face Tracker is a real-time computer vision algorithm that tracks the face and lets creators build masks that fit and respond to facial movements without writing a line of code.
- Sensor data powers effects where people can move their phone to pan around a virtual world.
- Scripting APIs allow developers to access and download data, respond to user interactions, and modify an effect in real time (a sketch of this scripting model follows the list).
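To make the scripting model concrete, here is a minimal sketch of an effect script reacting to a tap, written in TypeScript for clarity. The names in it (Scene, TouchGestures, find, onTap, face_mask) are illustrative assumptions standing in for the real scripting API, not documented AR Studio calls.

```ts
// Minimal sketch: respond to a user interaction by modifying the effect
// in real time. All module names and signatures here are assumptions
// made up for illustration, not the documented AR Studio API.
interface SceneObject { hidden: boolean; }
interface TapEvents { subscribe(handler: () => void): void; }

declare const Scene: { root: { find(name: string): SceneObject } };
declare const TouchGestures: { onTap(): TapEvents };

// Grab a mask object placed in the effect's scene, then toggle its
// visibility every time the user taps the screen.
const mask = Scene.root.find('face_mask');
TouchGestures.onTap().subscribe(() => {
  mask.hidden = !mask.hidden;
});
```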
Our earliest AR Studio beta partners, Electronic Arts' Mass Effect: Andromeda, GIPHY, Manchester United, Nike, Real Madrid, TripIt, and Warner Bros' Justice League, have used AR Studio to build camera effects that delight and engage their communities in new ways.
Create Effects for Facebook Live
As a part of the AR Studio beta program, new effects are now available in Facebook Live. AR Studio enables developers to design effects that respond directly, in real time, to what's happening in Facebook Live broadcasts, such as how many people are watching and what they are saying in comments. This opens up a whole set of new possibilities for developers and makes Facebook Live broadcasts even more engaging by connecting creative effects to interactions between broadcasters and viewers.
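The sketch below illustrates the idea by binding two hypothetical broadcast signals, a viewer count and a comment stream, to on-screen text. Every name in it (LiveStreaming, viewerCount, onComment, and the scene object names) is an assumption for this example rather than the actual AR Studio interface.

```ts
// Illustrative only: bind live-broadcast signals to elements of an effect.
// The LiveStreaming module and its signals are hypothetical stand-ins.
interface TextObject { text: string; }
interface Signal<T> { subscribe(handler: (value: T) => void): void; }

declare const Scene: { root: { find(name: string): TextObject } };
declare const LiveStreaming: {
  viewerCount: Signal<number>;              // how many people are watching
  onComment: Signal<{ message: string }>;   // what viewers say in comments
};

// Drive an on-screen counter from the number of viewers.
const counter = Scene.root.find('viewer_counter');
LiveStreaming.viewerCount.subscribe((count) => {
  counter.text = `${count} watching`;
});

// Echo the latest comment into the effect as it arrives.
const ticker = Scene.root.find('comment_ticker');
LiveStreaming.onComment.subscribe((comment) => {
  ticker.text = comment.message;
});
```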
See how these effects work by checking out two we're making available in Facebook Live today: This or That and GIPHY Live, both built with AR Studio to respond in real time to what's happening in a live broadcast.
This or That is an effect created by Facebook to show how broadcasters and viewers can interact live in new ways. Broadcasters pick between two options during the broadcast, while viewers comment with a hashtag for the option they think the broadcaster will choose. The effect then surfaces the most popular hashtag!
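Under the hood, picking the winner comes down to a running tally of hashtags seen in comments. This standalone sketch shows that logic; in a real effect the comments would arrive from the broadcast rather than from direct function calls.

```ts
// Tally hashtags from comments and surface the most popular one,
// the core of a voting effect like This or That.
const tallies = new Map<string, number>();

// Count every hashtag occurrence in a comment.
function recordComment(message: string): void {
  for (const tag of message.match(/#\w+/g) ?? []) {
    tallies.set(tag, (tallies.get(tag) ?? 0) + 1);
  }
}

// Return the most popular hashtag so far, or undefined if none seen.
function leadingHashtag(): string | undefined {
  let leader: string | undefined;
  let best = 0;
  for (const [tag, count] of tallies) {
    if (count > best) { best = count; leader = tag; }
  }
  return leader;
}

// Example: viewers vote between two options in comments.
recordComment('Go #pizza!');
recordComment('#tacos all the way');
recordComment('#pizza, obviously');
console.log(leadingHashtag()); // "#pizza"
```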