In March, we began a three-part series exploring the future of human-computer interaction (HCI). First, we laid out our 10-year vision of a contextually aware, AI-powered interface for augmented reality (AR) glasses that can use the information you choose to share to offer proactive assistance, allowing us to look up and stay present with those around us. Next, we dove into some nearer-term research: wrist-based input combined with usable but limited AI that dynamically adapts to you and your environment. Today, we conclude the series with a look at our haptic glove research and the advances it entails in soft robotics, microfluidics, hand tracking, haptic rendering, and perceptual science.
There is a team deep in Reality Labs (RL) Research tasked with inventing the future of interaction in augmented and virtual reality. They aren’t just looking a couple of years down the road. They’re casting a vision — based on their expertise in highly technical fields — for what our digital worlds will look like in 10 to 15 years. Their job is then to create the technology people will need for frictionless interaction with those worlds.
The work these researchers, engineers, and designers are doing is long-term research and development. In fact, it’s so novel that they are — in some cases — inventing entirely new domains of scientific research. The resulting technologies have the potential not only to fundamentally alter the course of augmented and virtual reality but also to influence fields as diverse as medicine and space travel.
This is the epicenter of the next era of human-computer interaction — a bold research project designed to tackle one of the central challenges of the metaverse: How do we touch the virtual world?
Touching the digital world with haptic gloves
Imagine working on a virtual 3D puzzle with a friend’s ultra-realistic 3D avatar. As you pick up a virtual puzzle piece from the table, your fingers automatically stop moving as you feel it within your grasp. You feel the sharpness of the cardboard’s edges and the smoothness of its surface as you hold it up for closer inspection, followed by a satisfying snap as you fit it into place.
Now imagine sitting down to work at a café and having a virtual screen and keyboard appear in front of you. The virtual keyboard conforms to the size of your hands and the space you have available, and it’s easily personalized to suit your preferences. You can feel the click of each keystroke, as well as the edges of the virtual keys on your fingertips, making it as easy as typing on a perfectly sized physical keyboard.
How would these experiences enhance your connection to the virtual world? What would they do for your ability to be productive or perform any action in the metaverse?
The closest experience we have to this today is hand tracking on Quest, which lets you see a digital version of your hands in VR and manipulate virtual objects without actually feeling them in your hands. While using your hands directly in VR is a vast improvement over Touch controllers, without haptic feedback we simply can’t be as dexterous in the virtual world as we are in the real world. The goal of this research is to change that.