In this video from our Redmond research lab, we can see a model of how the digital assistant in future AR glasses might not only perceive the objects in a room around you, but also anticipate what you might want to do with those objects, like turn on a TV or a lamp. It could even help you find your missing keys or keep an inventory of your fridge. Here, our researcher communicates with the AI system using a handheld clicking device, but one day we think wrist-worn EMG input devices will serve this purpose. Our research team is also exploring voice interfaces.
This future is still a long way off, but our research efforts are bringing it closer.
Data collection in the real world: Principles first
To keep building the future of AR, we're going to need a greater variety of first-person data from real-life situations that we can use to train digital assistants.
In gathering this data, we start by keeping our Responsible Innovation Principles in mind. In addition to "never surprise people" and "provide controls that matter," we make a point to "consider everyone" — even people who only come into contact with people using our research devices and products. And we "put people first," prioritizing what's best for communities while being a responsible steward of data.
As we noted when Project Aria was first announced, we placed restrictions on when and where people could gather data. For example, recording is never permitted in sensitive areas like restrooms, prayer rooms, locker rooms, or in sensitive meetings and other private situations — and it is only allowed in the homes of wearers with consent from all members of the household. All devices also display a prominent white light that indicates when they’re in use.
Once uploaded, captured data is kept under quarantine (not available to researchers) for three days. During that time, for any data set gathered in a public place, the system automatically blurs faces and license plates. Additionally, anyone can request that relevant segments of recorded data be deleted immediately or within the quarantine period, without the need for anyone to see the raw content.
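The quarantine flow described above can be sketched as a simple lifecycle: hold each upload for three days, automatically blur public captures, and honor deletion requests without anyone viewing the raw content. This is an illustrative sketch only — the `Recording` class, field names, and logic here are assumptions, not Facebook's actual pipeline:

```python
from datetime import datetime, timedelta

QUARANTINE = timedelta(days=3)  # hold period before researchers get access

class Recording:
    """Minimal model of one uploaded capture's quarantine lifecycle."""

    def __init__(self, recording_id: str, uploaded_at: datetime, in_public: bool):
        self.id = recording_id
        self.uploaded_at = uploaded_at
        self.in_public = in_public
        self.blurred = False
        self.deleted = False

    def auto_process(self) -> None:
        # For public captures, blurring of faces and license plates
        # happens automatically during quarantine (stand-in flag here).
        if self.in_public and not self.deleted:
            self.blurred = True

    def request_deletion(self) -> None:
        # Anyone may request deletion; no human reviews the raw content.
        self.deleted = True

    def is_released(self, now: datetime) -> bool:
        # Data becomes available to researchers only after quarantine,
        # and only if no deletion was requested.
        return not self.deleted and (now - self.uploaded_at) >= QUARANTINE
```

For example, a public capture uploaded on day 0 would be blurred immediately, unavailable on day 1, and released on day 3 — unless a deletion request arrives first.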
To ensure AR glasses work across different settings and cultural contexts, we announced last month that we expanded initial in-public use from the US to Singapore. In addition, some Facebook contractors and employees based in the UK, EU, Switzerland, Singapore, Canada, and Israel captured data in their own homes this year, with agreement from all members of their household.
In keeping with our commitment to diversity, equity, and inclusion, we need to capture a diverse data set from people of varying genders, heights, body types, ethnicities, and abilities. Starting in November, we'll open up participation in the countries where we're already collecting data to a wider group of Facebook employees and contractors, as well as external, paid research participants, bringing the total number of devices in use to about 3,000.
Of course, every participant will be required to complete training and follow the full set of rules before using the device, and when gathering data in public spaces, they must wear clothing and a lanyard that identify them as research participants in Project Aria. Participants also won't be able to directly view or listen to the raw data captured by the device.
Making first-person data available to the research community
Building contextually aware egocentric (first-person) AI is a decades-long journey. The academic field of artificial intelligence has taken massive strides in the past two decades, but despite all that progress, AI has yet to truly understand how we see the world.
That’s why Facebook AI recently announced Ego4D — a long-term project to make egocentric data publicly available to the research community, along with a set of benchmark challenges centered on first-person visual experiences for the AI assistants of the future.
Facebook Reality Labs is also expanding the Project Aria pilot program, which we started last year with Carnegie Mellon University, by making glasses available to the National University of Singapore, as well as other university partners in the future. This will help advance those institutions’ own egocentric perception research. While FRL Research will fund the work at CMU and NUS through sponsored contracts, the universities are not required to share the data they collect with us.
Each university we partner with will be responsible for complying with standards from institutional research ethics committees or review boards, as well as our Project Aria Research Community Guidelines. These requirements and best practices mirror Facebook’s own Project Aria privacy requirements (e.g. ensuring it is clear to bystanders that recording is taking place and blurring personally identifiable information such as faces and license plates).
New partnership with BMW
Finally, in addition to our growing academic partnerships, we’re also proud to announce our first industry partnership with BMW. We think that AR glasses could eventually help drivers navigate their surroundings. Before we can get there, partners like BMW are interested in exploring how AR technology could integrate into tomorrow’s vehicles.
This partnership will also help our researchers explore how AR glasses, which will rely on visual cues to identify their location, can situate themselves in a moving car.
As with our academic partners, BMW and any future industry partners are required to abide by our Project Aria Research Community Guidelines.
Academic and industrial research institutions interested in participating in Project Aria can submit their proposals here.
Thinking through social responsibility
As we look towards expanding Project Aria in the future, it's our duty to gather and use data responsibly. So, for example, last month in Singapore we hosted a Design Jam between Facebook researchers and 28 external privacy experts and academics from the National University of Singapore and Nanyang Technological University, from industry groups like the Singapore Chapter of the VR/AR Association, and from startups like Element XR, to walk through the real-world implications of Project Aria data collection.
More broadly, we’ll be working with an ethicist to help us think through more of the privacy implications and social acceptability issues related to Project Aria.
All this is just a start. We still have a long way to go before AR glasses are fully realized. We’re working with academics, engineers, other companies, and privacy experts to ensure we understand both the technology and the impact it will have on the people who use it — and, of equal importance, the people who don’t. These measures help make sure everyone’s privacy and safety are protected.
There are many more issues to explore, so continue to check out the Project Aria website for more information.