Passthrough camera access is stirring up quite the debate in the XR world, especially now that we know where Meta, Apple, and Pico stand. Everyone’s curious about Google’s plans for Android XR. After getting in touch with Google directly, I can confirm they’ll roll out a solution similar to how phones handle camera access. Stick around for more insights!
Understanding the Camera Access Dilemma
Let’s rewind a bit for those who might be wondering what this whole passthrough camera access fuss is about. The latest VR headsets are essentially MR headsets, equipped to display an RGB passthrough view of the real world using their front cameras. This capability powers delightful mixed reality applications like Cubism, Starship Home, and Pencil.
These headsets’ operating systems use the camera-captured frames to deliver this passthrough view. We developers are eager to tap into those camera frames too. Why? To use AI and computer vision to provide mind-blowing enhancements to the user’s world. I firmly believe that such access is the gateway to unlocking true mixed reality: without it, applications can’t fully understand and interact with their surroundings. For example, with some clever camera-access tricks on Quest, I developed a prototype AI+MR app that assists with interior design; it wouldn’t have been feasible without such access.
Sounds exciting, doesn’t it? But here’s the rub: privacy concerns. If rogue developers get access, they could exploit the camera to secretly capture images of a user’s environment and use AI to glean sensitive info, like ID documents or credit cards lying around. Not to mention the potential misuse of capturing images of faces and bodies.
It’s a precarious balance—ensuring user privacy while unleashing mixed reality’s full potential.
How XR Companies are Tackling It
In the early days, access to camera data was freely granted, no strings attached. Those who’ve been following me might recall our camera texture experiments on the Vive Focus back in 2019—notable projects like diminished reality, Aruco marker tracking, and sound-reactive applications.
As mixed reality took off, headset manufacturers got cautious and restricted camera access to address privacy fears: Meta, Pico, HTC, and Apple all closed this door to developers.
That became the norm until XR developers pushed back, advocating for this critical feature. Champions of this cause, including Cix Liv, Michael Gschwandtner, and me, demanded clear, user-transparent camera access. Our argument? Phones, the ubiquitous devices in our pockets, grant camera access after a mere permission request, so why not XR devices?
Eventually, XR firms began to relent, with Meta hinting at a “Passthrough API” launch at the year’s start. But what about Google’s take with Android XR?
Android XR: Eyeing the Phone Model
Globally, most phones run on Android. When developing an Android app, you can easily request camera access, and once the user grants consent, you simply specify which camera you want (like ID 0 for the back camera) and you’re good to go.
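For context, here is a minimal sketch of what that flow looks like on a regular Android phone, using the standard permission request plus the Camera2 API. The Activity around it is just illustrative scaffolding, not code from any Android XR sample.

```kotlin
import android.Manifest
import android.annotation.SuppressLint
import android.content.Context
import android.content.pm.PackageManager
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class CameraDemoActivity : AppCompatActivity() {

    // Standard Android permission flow: the user sees a consent dialog
    // and the app only proceeds if access is granted.
    private val permissionLauncher =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) openBackCamera()
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED
        ) {
            openBackCamera()
        } else {
            permissionLauncher.launch(Manifest.permission.CAMERA)
        }
    }

    @SuppressLint("MissingPermission") // permission is checked above
    private fun openBackCamera() {
        val manager = getSystemService(Context.CAMERA_SERVICE) as CameraManager
        // "0" is conventionally the back (world-facing) camera on phones.
        manager.openCamera("0", object : CameraDevice.StateCallback() {
            override fun onOpened(camera: CameraDevice) {
                // Create a capture session here to start receiving frames.
            }
            override fun onDisconnected(camera: CameraDevice) = camera.close()
            override fun onError(camera: CameraDevice, error: Int) = camera.close()
        }, null)
    }
}
```

The app also needs the CAMERA permission declared in its manifest, of course.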
Google is aligning Android XR to be just as compatible with existing Android apps. Though rumors swirled for a while, I got the scoop from a Google spokesperson via email. Here’s what she said about camera access on Android XR:
Just like Android apps, XR developers can access camera frames with user permission. The system supports this through standard Android Camera APIs, like Camera2 and CameraX. For the world-facing camera feed, analogous to a smartphone’s back camera, the app must request appropriate permissions.
When it comes to selfie-camera access, developers get an image stream of the user’s avatar. This feed is generated by avatar-providing apps from the head, hand, eye, and facial-expression tracking data exposed through the OpenXR APIs.
So the standard Android classes like CameraX should also manage camera streams on Android XR headsets. Since these classes allow capturing frames, saving media, and running image analysis, developers should be able to bring those same capabilities to headsets too. This is indeed a promising development!
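To make that concrete, here is a minimal sketch of how frame access could look if the headset really exposes its world-facing camera through the standard CameraX API. The analyzer body is just a placeholder, and whether the passthrough camera shows up as the default back camera is my assumption based on the statement above.

```kotlin
import android.content.Context
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import androidx.lifecycle.LifecycleOwner

// Bind a CameraX ImageAnalysis use case so every frame of the
// world-facing camera is delivered to our analyzer callback.
fun startFrameAnalysis(context: Context, lifecycleOwner: LifecycleOwner) {
    val providerFuture = ProcessCameraProvider.getInstance(context)
    providerFuture.addListener({
        val provider = providerFuture.get()

        val analysis = ImageAnalysis.Builder()
            .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            .build()

        analysis.setAnalyzer(ContextCompat.getMainExecutor(context)) { imageProxy ->
            // imageProxy holds the YUV frame: run computer vision / AI here,
            // then close it so CameraX can deliver the next frame.
            imageProxy.close()
        }

        provider.bindToLifecycle(
            lifecycleOwner,                      // e.g. the hosting Activity
            CameraSelector.DEFAULT_BACK_CAMERA,  // assumed to map to the passthrough camera
            analysis
        )
    }, ContextCompat.getMainExecutor(context))
}
```

The appeal of this approach is that the exact same code already runs on phones today, which is the whole point of Google’s strategy.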
The fact that the “selfie camera” yields a reconstructed avatar, akin to Apple Vision Pro’s approach, keeps Android XR consistent with how Android behaves on phones: an app accessing the “rear camera” sees the user’s surroundings, while one accessing the “selfie camera” sees the user’s face, here rebuilt as an avatar from the tracked user data.
Looking Ahead with Android XR
Google wants existing Android apps to seamlessly work on Android XR, and these choices ensure camera-access features follow suit. It’s a smart move, aligning permission requests across phones and headsets.
You might still have a few questions. One likely concern is access to all the raw camera streams. Sadly, the news isn’t optimistic here:
Currently, no route enables applications to access non-standard sensor data streams.
This means that streams beyond the standard front and back cameras are off-limits for now. Perhaps enterprise-level access will arrive over time.
For Unity developers worried that Camera2 and CameraX are native Android classes: don’t sweat it. If Android XR really adopts Android’s standard camera handling, Unity developers should be able to use the WebcamTexture class to access frames. If not, a native CameraX wrapper library that exposes frames to Unity via JNI could serve as a workaround, along the lines of the sketch below.
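To give an idea of what such a workaround might look like, here is a sketch of the Kotlin side of a hypothetical wrapper. All class, package, and method names here are mine, not part of any official Android XR or Unity API; the Unity side would load this as a plugin and call it through AndroidJavaObject.

```kotlin
package com.example.xrcamerabridge  // hypothetical plugin package

import android.content.Context
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import androidx.lifecycle.LifecycleOwner
import java.util.concurrent.atomic.AtomicReference

// Hypothetical Kotlin side of a CameraX wrapper for Unity.
// Unity would poll getLatestFrame() from C# via AndroidJavaObject,
// e.g. plugin.Call<byte[]>("getLatestFrame").
class UnityCameraBridge(
    private val context: Context,
    private val lifecycleOwner: LifecycleOwner
) {
    // Latest luminance buffer, stored as a plain byte array so the
    // C# side doesn't have to deal with ImageProxy objects.
    private val latestFrame = AtomicReference<ByteArray?>(null)

    fun start() {
        val providerFuture = ProcessCameraProvider.getInstance(context)
        providerFuture.addListener({
            val provider = providerFuture.get()
            val analysis = ImageAnalysis.Builder()
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build()
            analysis.setAnalyzer(ContextCompat.getMainExecutor(context)) { proxy ->
                // Copy the Y (luminance) plane and stash it for Unity to poll.
                val buffer = proxy.planes[0].buffer
                val bytes = ByteArray(buffer.remaining()).also { buffer.get(it) }
                latestFrame.set(bytes)
                proxy.close()
            }
            provider.bindToLifecycle(
                lifecycleOwner,
                CameraSelector.DEFAULT_BACK_CAMERA,
                analysis
            )
        }, ContextCompat.getMainExecutor(context))
    }

    fun getLatestFrame(): ByteArray? = latestFrame.get()
}
```

This mirrors the approach many of us already used on other headsets: keep the camera plumbing on the native side and hand Unity only the raw byte buffers it needs.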
A Note on Android XR
At present, Android XR is still in its preview phase; no headsets running it have hit the market yet. So all the information above could still change before the official release. I doubt it will, but it’s wise to stay aware of possible shifts.
Towards Expanding Camera Access
With Google and Meta opening up, others will likely follow. It’s possible 2025 will become a landmark year for exploring new mixed reality dimensions. I’m eager to witness the innovative wonders our community of developers will craft!