As you might already know, there are three types of extended reality (XR) technologies: AR, VR, and MR. Looking at the accessibility of these technologies right now, AR is the most accessible since it works on our smartphones and almost everyone has one; it is affordable as well. VR tech is not as easily accessible as AR, but it is far more accessible than MR. Meta’s Quest 2 is currently the best affordable headset, and I believe most VR developers have one. MR tech is the least accessible of the three: the HoloLens, the Snapchat Spectacles, and other devices on the market are very expensive, and it’s not easy to get hold of one either. But personally, I feel MR has more applications than the other two. So at this point in time, what’s the best alternative we have to learn and develop MR? I would say the Meta Passthrough API. Yes! That does mean you need a Quest 2 with you to learn and develop with this API.
The API takes the camera feed and processes it through a filter called passthrough. So the output you see on the display is not the actual environment as it is, but a processed one. We will not have access to the actual images or videos of a user’s physical environment. Since the output is rendered through a filter, we have some flexibility to tweak that filter to get a certain output. As of now, there are only three ways to do this.
First is Styling, which lets us colorize the passthrough feed, highlight edges, and apply image-processing effects such as contrast adjustment and posterization (stair-stepping the grayscale values).
The second is Composite layering, which lets you specify the placement of the passthrough layer relative to virtual content (overlay or underlay) and how it should blend with that content. Using alpha masking, you can specify where on the screen Passthrough shows up.
And the last one is Surface-Projected Passthrough, which lets us define the geometry onto which Passthrough images are projected. This approach leads to a more stable rendition of Passthrough in cases where parts of the user’s environment are known to the application. A short code sketch of all three approaches follows below.
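To give a feel for what these three approaches look like in code, here is a minimal sketch built around the Oculus Integration package’s `OVRPassthroughLayer` component (we’ll set up the actual project in the sections below and build a scene in the next part). Treat the property and method names as indicative rather than definitive; they can shift slightly between SDK versions, and the values here are just placeholders.

```csharp
using UnityEngine;

// Sketch only: attach next to an OVRPassthroughLayer (usually on the OVRCameraRig).
public class PassthroughCustomizationSketch : MonoBehaviour
{
    // Hypothetical quad/mesh matching a real surface (e.g. a desk) for surface projection.
    [SerializeField] private GameObject projectionSurface;

    private void Start()
    {
        var layer = GetComponent<OVRPassthroughLayer>();

        // 1) Styling: edge highlighting plus contrast/brightness/posterize controls.
        layer.edgeRenderingEnabled = true;
        layer.edgeColor = Color.cyan;
        layer.SetColorMapControls(0.3f, 0.1f, 0.2f); // contrast, brightness, posterize

        // 2) Composite layering: render passthrough behind the virtual content
        //    and control how strongly it shows through.
        layer.overlayType = OVROverlay.OverlayType.Underlay;
        layer.textureOpacity = 0.8f;

        // 3) Surface-projected passthrough: project the feed onto known geometry.
        //    Requires the layer's projection surface mode to be set to "User Defined".
        if (projectionSurface != null)
        {
            layer.AddSurfaceGeometry(projectionSurface, true);
        }
    }
}
```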
Before we can start customizing the passthrough filter, we need to set up our Unity project for passthrough. So in this blog, we’ll be focusing on all the setup that is required to make passthrough work on your device.
Before we begin with the setup and implementation, make sure to have the following requirements met:
You can connect your machine to the Quest 2 either via a Link cable or via Air Link; the choice is yours. Once you have the Quest 2 connected:
💡 Note: If you already had the Unity project open while you made changes to the above settings, restart it.
The Oculus Integration package comes with a few example scenes and resources, and those scenes are set up for the standard 3D template, i.e. the Built-in Render Pipeline. So while choosing the render pipeline for your project, you can stick with the Built-in Render Pipeline if you want to test the example scenes or use the bundled resources; otherwise, you can use URP as well.
Wait for the Unity project to reopen. With that, we have successfully imported the Oculus Integration package with the latest Oculus plugins.
This section tells you about the custom settings that are required to run the Passthrough API on your Quest 2, whether you test it over Link or build it.
Now, with these settings, we’ll be able to test Passthrough via the Unity editor as well as build it to the Quest 2 once the scene is ready.
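If you ever want the script-side equivalent of the “Enable Passthrough” checkbox on the OVRManager, a small sketch like this can flip it at runtime (assuming the `isInsightPassthroughEnabled` field that recent Oculus Integration versions expose; the inspector checkbox remains the simpler route):

```csharp
using UnityEngine;

// Optional sketch: enable Insight Passthrough from code instead of (or in addition to)
// ticking "Enable Passthrough" on the OVRManager component.
public class EnablePassthroughAtRuntime : MonoBehaviour
{
    private void Awake()
    {
        if (OVRManager.instance != null)
        {
            OVRManager.instance.isInsightPassthroughEnabled = true;
        }
    }
}
```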
To see if we can use passthrough via the Link cable:
💡 Note: Currently, there is no way to capture the output when you are testing the Passthrough API via Link. However, it’s possible to capture it by building the APK and streaming the output using SideQuest.
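As a quick sanity check while testing over Link, a tiny script like the sketch below can toggle the passthrough layer with the A button (assuming an `OVRPassthroughLayer` on the same GameObject; `hidden` is the property recent SDK versions use to show or hide the layer):

```csharp
using UnityEngine;

// Sketch: press the A button on the right Touch controller to show/hide passthrough.
public class PassthroughToggleSketch : MonoBehaviour
{
    private OVRPassthroughLayer _passthroughLayer;

    private void Start()
    {
        _passthroughLayer = GetComponent<OVRPassthroughLayer>();
    }

    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            _passthroughLayer.hidden = !_passthroughLayer.hidden;
        }
    }
}
```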
We have learned how to set up the Oculus PC app, download and install the Oculus Integration package, and set up the Unity project so that we can test passthrough via Oculus Link. The same setup also lets us easily build the APK to the Quest 2.
In the next part, we’ll learn how to create a simple scene from scratch to implement passthrough.
If you've enjoyed the insights shared here, why not spread the word? Share the post with your friends and colleagues who might also find it valuable.
Your support means the world to us and helps us create more content you'll love.