From the mainframe to the mobile phone, the evolution of computing has trended toward ubiquity — what was once accessible for a select few is now in the palm of billions of people’s hands across the globe. At Facebook Reality Labs (FRL), we’re building the next great computing platform, and we’re putting people at the center from the start.
Facebook Reality Labs’ principles for building the future
Virtual and augmented reality will revolutionize how we work and collaborate. Rather than simply working remotely, colleagues on distributed teams can come together in a shared virtual space, benefiting from the serendipity and community engendered by face-to-face interactions, despite the limitations of physical distance.
And the AR glasses of the future will package it all in a convenient, all-day wearable form factor, liberating us from the confines of a desktop or hand-held screen. They’ll empower us, giving us access to near-instantaneous contextualized information to help us better navigate the world. They might improve our hearing as well as our vision. They’ll deliver social presence that’s far richer than what’s possible with a VR headset today. They’ll help us connect with the people who matter most to us. And they’ll give us what today we might call “superpowers,” but in the future will be simply: empowering.
We’re talking about building the future. But many elements of the technology needed to build this future simply don’t exist yet — nor do expectations around their responsible development and use. As we build the next computing platform centered around people, we’re committed to driving this innovation forward in a responsible, privacy-centric way. That’s why we’ve crafted a set of principles for responsible innovation that guide all our work in the lab and help ensure we build products that are designed with privacy, safety, and security at the forefront.
Never surprise people
We’re transparent about how our products work, the data they collect, and how that data is used over time so that people know what to expect.
Provide controls that matter
We build simple controls that are easy to understand, and we’re clear about the implications involved in people’s choices.
Consider everyone
We build for people of all backgrounds, including people who aren’t using our products but may be affected by them.
Put people first
We strive to do what’s right for our community, individuals, and our business. When faced with tradeoffs, we prioritize what’s best for our community.
None of this is new. These issues have been top of mind for years. Our goal with these principles is to further operationalize our approach to building responsibly. We’ve been sharing these principles with, and seeking input from, many experts across the privacy, safety, and AR/VR community, including Kavya Pearlman, Founder and CEO of XR Safety Initiative, and Jeremy Greenberg, Policy Counsel at the Future of Privacy Forum. Consultation is already a crucial aspect of our product development process, and going forward these principles will help inform how we engage with experts through an updated approach and expanded initiatives with a more diverse audience. We will also continue working with experts on the specific policy questions related to these principles.
As part of this, we’re announcing two requests for proposals for $1M USD to fund research on two topics: the impact of AR, VR, and smart device technology on non-users, especially those from under-represented communities, and best practices for fostering welcoming and inclusive environments in 3D spaces. This comes on the heels of another RFP we launched earlier this year around Trust in AR, VR, and Smart Devices, and we’re pleased to share the recipients today.
These principles are only meaningful if they are actionable. We must apply them at every stage of the product lifecycle, from research to development to launch and beyond. That’s why our privacy and trust team is collaborating with experts from across Facebook Reality Labs and the company — including our dedicated Responsible Innovation team — to examine our existing processes, identify gaps, and ensure consistency in the application of these principles to how we make decisions and build products.
That might mean updating existing processes. For example, privacy is already baked into our teams’ planning and prioritization; teams are required to include privacy goals as part of their planning process. We will be expanding this to reflect our principles going forward. Or it may mean creating entirely new frameworks, like the model we’re developing to guide decisions about how we build signals and indicators for our products in ways that take non-users into account.
This is just a starting point: Our principles of responsible innovation will continue to evolve in lockstep not only with the technology we’re building but also with people’s expectations.
The promise of this technology is exciting — but it’s critical that we design and build responsibly from the start. In this time of physical distancing, the work we do to bring people together feels more pressing than ever. And we’re committed to getting it right.