
Playing with the future: Oculus Insight powers beyond-room-scale VR gaming

September 26, 2018

At OC5, we announced Oculus Insight, our inside-out positional tracking technology that powers immersive room-scale experiences on Oculus Quest. With the new Dead and Buried Arena demo, we gave attendees a taste of how Oculus Insight goes beyond room-scale to enable arena-scale tracking in a massive 4,000-square-foot play area without using any external sensors.

Six players teamed up in a shared game environment through co-location, where players enjoy a shared virtual experience while also sharing physical space.

If you couldn’t make it to OC5, here’s a look at how these technologies worked together:

Arena scale
At the show, we also announced Oculus Quest, our upcoming all-in-one headset featuring Oculus Insight. This technology works by detecting thousands of points in your environment to compute an accurate position of the headset every millisecond. The end result is a greater sense of immersion, presence, and mobility. For the Dead and Buried Arena demo, we prototyped arena-scale technology on top of Oculus Insight, pushing inside-out tracking beyond room-scale to give you greater mobility and freedom.
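To make that concrete, here's a minimal sketch of feature-based pose estimation, the general idea behind inside-out tracking. Insight's actual pipeline is not public (among other things, it fuses camera data with IMU measurements), so the OpenCV-based code below is purely illustrative.

```python
# Illustrative sketch only: feature-based pose estimation with OpenCV.
# Oculus Insight's real pipeline is proprietary and fuses IMU data as well.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=2000)  # detect up to ~2,000 points per frame
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def estimate_headset_pose(frame, map_points_3d, map_descriptors, K):
    """Match 2D image features to known 3D map points, then solve for pose."""
    keypoints, descriptors = orb.detectAndCompute(frame, None)
    matches = matcher.match(descriptors, map_descriptors)
    pts_2d = np.float32([keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([map_points_3d[m.trainIdx] for m in matches])
    # RANSAC rejects outlier matches; rvec/tvec give the camera's rotation
    # and translation relative to the map's coordinate frame
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts_3d, pts_2d, K, None)
    return rvec, tvec
```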

We mapped the environment so that real obstacles aligned with their in-game counterparts and ensured each headset localized in the same map. Taking advantage of the four ultra-wide-angle sensors on Oculus Quest, we optimized the arena for accurate tracking at scale. We incorporated design elements on the ceiling, floor, and throughout the experience to make sure Insight could track players anywhere. Because tracking is accurate to the millimeter, we could use these trackable objects as real-world manifestations of in-game objects. When players encountered an object, like a crate, they could physically touch it and use it for cover, adding a layer of strategy to the experience.
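As a hypothetical illustration of how a surveyed obstacle doubles as an in-game prop: once the physical crate's pose is measured in the shared map, any headset that localizes in that map can render the virtual crate at the same physical spot. The helper names and coordinates below are invented for the example.

```python
# Hypothetical example: anchoring a virtual prop to a surveyed real object.
import numpy as np

def make_anchor(position_m, yaw_rad):
    """4x4 rigid transform (map -> object) for a floor-standing prop."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = position_m
    return T

# Measured once, in shared-map coordinates (values are made up)
crate_anchor = make_anchor(position_m=[3.20, 7.85, 0.0], yaw_rad=0.5)

def pose_in_headset_frame(T_map_headset, T_map_object):
    """Where the prop appears from a given headset's point of view."""
    return np.linalg.inv(T_map_headset) @ T_map_object
```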

Mixed reality 
In the Dead and Buried Arena demo, we showcased an early application of mixed reality (MR) to enable self-presence (seeing your body in virtual space) and spatial awareness (seeing your environment while in VR). As players bring the headset over their eyes, MR lets them see their environment and fellow players — no bumping into obstacles or each other — as they gradually “enter” the western-themed virtual world of Dead and Buried.

We leveraged the existing tracking cameras used for Oculus Insight. First, we ran a real-time stereo depth-matching algorithm on each camera image to extract lines and contours of the environment. We then reconstructed these images from the player’s eye position to make everything perceptually comfortable. All of these complex algorithms run on-device and in real time.
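As a rough sketch of that pipeline with off-the-shelf OpenCV primitives (the on-device algorithms are not public, so StereoSGBM and Canny stand in for the real-time depth-matching and contour-extraction stages):

```python
# Rough stand-in for the passthrough pipeline: stereo depth plus contours.
import cv2
import numpy as np

# numDisparities must be a multiple of 16; parameters here are illustrative
stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)

def passthrough_frame(left_gray, right_gray, Q):
    """Q is the 4x4 reprojection matrix from stereo calibration."""
    # SGBM returns fixed-point disparities scaled by 16
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # Lift every pixel to 3D so the scene can be re-rendered from the
    # player's eye position instead of the camera position
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    contours = cv2.Canny(left_gray, 50, 150)  # lines and edges of the room
    return points_3d, contours
```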

Co-location multiplayer
Today, the only way for multiple players to share VR experiences is online, where each person has their own personal space to move. That’s great for connecting with friends at any distance — but what if your friend is standing in your living room? Maybe you want to play together or watch a movie in shared virtual space. This introduces a unique set of challenges. A shared physical space means individual headsets need to share a common play area and need to be aware of each other’s location in real-time.

Oculus Insight can build and store a “spatial map” of any environment locally, on-device. It can then retrieve this map and use the relevant data to “see” where it’s located. In the demo, we created a master map of the entire space and made it accessible to multiple devices in the same room over the network. That way, each device knows where it is in relation to the others, allowing players to co-locate in the same experience.
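Assuming each device can report its pose in the master map's frame as a 4x4 matrix, co-location then reduces to a change of coordinates; a minimal sketch:

```python
# Minimal sketch: once two devices localize in the same map, each can
# express the other's position in its own local frame.
import numpy as np

def relative_pose(T_map_a, T_map_b):
    """Pose of device B in device A's frame: inv(T_map_a) @ T_map_b."""
    return np.linalg.inv(T_map_a) @ T_map_b

# Because both poses live in one shared coordinate frame, player B's avatar
# renders at exactly the physical spot where player B is standing.
```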

Asymmetric co-location
Our prototype asymmetric co-location technology lets you pick up your mobile device, walk around the play space, and see what your friends are doing in VR. This is a proof-of-concept example of how we could bridge the gap across different platforms and bring people together in a single immersive experience.

We leverage Oculus Insight and the shared spatial maps from each headset to bring a tablet into the VR world as a tracked, but invisible, player. As the participant walks around, they’re treated to a one-to-one mapping of the tablet’s movement in the shared space, so they can see the action in real-time — from their exact vantage point.
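A hypothetical sketch of that spectator path: the tablet localizes against the shared map just like a headset, and the inverse of its pose becomes the view matrix for rendering. The shared_map.localize() and render_scene() helpers below are placeholders, not real Oculus APIs.

```python
# Hypothetical spectator view: the tablet is tracked like a headset but
# never drawn, so it acts as an invisible player holding the camera.
import numpy as np

class SpectatorTablet:
    def __init__(self, shared_map):
        self.shared_map = shared_map  # same map the headsets localize against

    def view_matrix(self, camera_image):
        # Localize the tablet in the shared map (placeholder API)
        T_map_tablet = self.shared_map.localize(camera_image)
        # A view matrix is the inverse of the device's pose in the world
        return np.linalg.inv(T_map_tablet)

# Each frame: render the shared game state from the tablet's vantage point.
# frame = render_scene(game_state, spectator.view_matrix(camera_image))
```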

Step inside the future
These new technologies are a proof of concept that demonstrates the potential of headsets like Oculus Quest. We expect them to deliver enhanced, VR-based social experiences for our location-based entertainment (LBE) developers and the broader developer ecosystem. It’s a future we believe in, and it’s one we can look forward to building alongside the creator and developer community.

It takes a global team of computer vision and software experts across Oculus to build world-class tracking and reconstruction technology. A diverse group of professionals based in Menlo Park, Seattle, Dallas, and Zurich worked to build and integrate these technologies into a compelling VR experience — and we’re just getting started. We look forward to sharing the journey with you!

— The Oculus Team

This post originally appeared on the Oculus blog.
