What else? Legs, of course. “I think everyone has been waiting for this,” Zuckerberg joked during today’s keynote. No more floating from the waist up!
It may sound like we’re just flipping a switch behind the scenes, but this took a lot of work to make happen. When your digital body renders incorrectly — in the wrong spot, for instance — it can be distracting or even disturbing, and take you out of the experience immediately. And legs are hard! If your legs are under a desk or even just behind your arms, then the headset can’t see them properly and needs to rely on prediction.
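To make that concrete, here's a minimal sketch of one common pattern for handling occluded joints: when the tracker's confidence in a leg joint drops, blend its estimate toward a pose predicted from the parts of the body the headset can still see. This is an illustration only, not Meta's actual tracking pipeline; the joint layout, threshold, and predictor are all assumptions.

```python
import numpy as np

CONF_THRESHOLD = 0.6  # below this, a camera-based joint estimate is treated as unreliable (assumed value)

def predict_from_upper_body(upper_body_pose):
    """Stand-in for a learned predictor: infer plausible knee and ankle
    positions from the tracked upper body (here, a fixed standing prior
    hanging below the hips)."""
    hips = upper_body_pose[-1]                      # assume the last row is the hips
    return np.array([hips - [0.0, 0.45, 0.0],       # knee offset below hips (assumed)
                     hips - [0.0, 0.90, 0.0]])      # ankle offset below hips (assumed)

def resolve_leg_pose(tracked_legs, confidences, upper_body_pose):
    """Blend camera tracking with the prediction, per joint, weighted by
    how confident the tracker is in each joint."""
    predicted = predict_from_upper_body(upper_body_pose)
    w = np.clip(confidences / CONF_THRESHOLD, 0.0, 1.0)[:, None]
    return w * tracked_legs + (1.0 - w) * predicted

# Example: the knee is visible (high confidence), the ankle is hidden under a desk.
upper_body = np.array([[0.0, 1.4, 0.0],   # head
                       [0.0, 1.0, 0.0]])  # hips
legs = np.array([[0.0, 0.55, 0.2],        # tracked knee (noisy but visible)
                 [0.0, 0.10, 0.5]])       # tracked ankle (unreliable: occluded)
conf = np.array([0.9, 0.1])
print(resolve_leg_pose(legs, conf, upper_body))
```

A production system would presumably use a learned body model rather than a fixed standing prior; the point of the sketch is the confidence-gated fallback from tracking to prediction.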
We spent a long time making sure Meta Quest 2 could accurately — and reliably — bring your legs into VR. Legs will roll out to Horizon Worlds first, so we can see how it goes. Then we’ll begin bringing legs into more experiences over time as our technology improves.
Next year, we’ll also enable developers to start building custom Avatar actions and behaviors into their games and apps.
And remember, avatars are still evolving to become better-looking, more capable, more expressive and more customizable. Our next generation of Meta Avatars, previewed at Connect, will be more expressive and detailed than what’s available today.
Each step brings us a bit closer to photorealistic avatars. Those are still a few years off, but we’re steadily improving both the technology and our understanding of how people want to show up in VR. We’re working on AI that will design an accurate avatar for you, so you don’t need to spend time tinkering with the pieces yourself (unless you want to). In the future, you might also have multiple avatars for different occasions — a serious photorealistic representation of yourself for work meetings, and a more cartoonish version for hanging out. You could even show up to the group hang as a movie character, or a dragon. Who’s to say?
The road to augmented reality
While we’ve made a lot of progress on virtual and mixed reality, two important windows on the metaverse, a lot of work remains to be done in augmented reality, where you see digital objects overlaid perfectly on the world around you. There are still a few years to go before the fundamental technology is advanced enough to support great AR glasses. We’ll need to see progress across the stack — compute, graphics, displays, sensors, AI, basically everything — before the dream of AR glasses is fully realized.
We’re investing in those areas (more on that below), but for glasses, the form factor is critical — all this tech has to be built into something lightweight and comfortable enough to wear casually. So we’re focused on building as much of the AR experience as we can fit into a normal pair of glasses that can blend in with your daily life.
We took our first step forward last year when we partnered with EssilorLuxottica to introduce Ray-Ban Stories, our first-generation smart glasses. In just a year, we’ve introduced big improvements: doubling video capture length from 30 to 60 seconds and making it easier to upload your content from the glasses to Instagram.
Hands-free functionality is a core part of Ray-Ban Stories, letting you stay connected and in the moment without having to fumble around for a phone. Soon, you’ll also be able to make calls and send texts hands-free on Ray-Ban Stories using your phone number.
We also know people love listening to music on Ray-Ban Stories, so we’ll soon start rolling out Spotify Tap playback. You’ll just tap and hold the side of your glasses to play Spotify, and if you want to hear something different, tap and hold again and Spotify will recommend something new.
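For illustration only, here's a tiny sketch of that gesture flow. The controller class and the music-service methods are assumptions for the example, not the actual Ray-Ban Stories or Spotify Tap APIs.

```python
# Hypothetical sketch of the tap-and-hold flow described above: the first
# long-press starts playback, and each subsequent long-press asks the music
# service for a fresh recommendation.

class TapPlaybackController:
    def __init__(self, music_service):
        self.music = music_service       # assumed to expose play() / recommend_next()
        self.playing = False

    def on_tap_and_hold(self):
        if not self.playing:
            self.music.play()            # first gesture: start playback
            self.playing = True
        else:
            self.music.recommend_next()  # later gestures: play something new
```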
And we shared an update on Meta Spark. Starting today, we’re giving creators the ability to build interactive 3D objects using Spark Studio and begin testing them in mixed reality through the Meta Spark Player. We’re also building tools that will let creators better understand how gaze and spatial awareness can contribute to 3D content layered on top of the physical world.
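As a rough illustration of the gaze part (a minimal sketch, not the Meta Spark API): one simple way gaze can drive 3D content layered on the physical world is to cast a ray from the eye along the gaze direction and test whether it passes near a virtual object. The scene layout and numbers below are assumptions.

```python
import numpy as np

def gaze_hits_object(eye_pos, gaze_dir, obj_center, obj_radius):
    """Ray-sphere intersection: does the gaze ray pass within
    obj_radius of obj_center?"""
    d = gaze_dir / np.linalg.norm(gaze_dir)
    to_obj = obj_center - eye_pos
    t = np.dot(to_obj, d)                 # distance along the ray to the closest point
    if t < 0:                             # object is behind the viewer
        return False
    closest = eye_pos + t * d
    return np.linalg.norm(obj_center - closest) <= obj_radius

# Example: a virtual label 2 m in front of the user, looked at directly.
print(gaze_hits_object(np.array([0.0, 1.6, 0.0]),    # eye height ~1.6 m
                       np.array([0.0, 0.0, -1.0]),   # looking straight ahead
                       np.array([0.0, 1.6, -2.0]),   # object 2 m away
                       0.15))                        # 15 cm bounding radius
```

Real gaze-aware tooling would layer dwell times, smoothing and spatial context on top of a test like this, but the ray cast is the basic building block.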
Reality Labs Research
Of course, no Connect would be complete without a visit from Reality Labs Chief Scientist Michael Abrash. He joined Zuckerberg to share a behind-the-scenes look at what our research teams have been working on, providing an update on Codec Avatars, Project Aria, and our work to create the first truly human-centric computing interface.
At Reality Labs, we’re inventing a new computing platform — one built around people, connections and the relationships that matter. There’s a lot of ground to cover there, so check out our post on today’s Reality Labs Research news to learn more.
The research and products we showed today are part of a roadmap that extends far into the future — and we can’t build the metaverse alone. We want to bring the best developers, engineers, artists and others together to make this future a reality. Thank you for joining us on this journey. It promises to be a pretty epic ride.