Nov 7 • Abdelrahman Abdelrahman

Create Interactions That Work Cross-Platform

Creating interactions that work seamlessly across different platforms is what most developers hope for. This is now possible with Meta's latest Interaction SDK.

You can now build experiences that work on various VR devices, including Quest headsets and PCVR headsets like the HTC Vive and Valve Index, with just minor differences in the build settings.

In this blog, we'll see how to set up various interactions, like Poke, Ray, and Grab, that work across different platforms.

Cross-Platform Interactions with Meta's Interaction SDK



Why Cross-Platform Matters

Imagine developing your VR experience once and deploying it across multiple platforms without starting from scratch each time. For example, you can develop an app and publish it on the Meta Horizon Store, and then, with minor changes to your project settings, build and publish it on Steam as well.


Understanding the Meta Interaction SDK Packages

The Interaction SDK comes in two packages: Essentials, which supports devices using OpenXR, and the Core package for Quest devices. This dual-package approach lets you build for a wide range of VR headsets while leveraging device-specific features like passthrough and hand tracking.


1.0 Developing for Standalone Quest Devices 


1.1 Getting Started with Unity and Importing Packages

  1. First things first, let's set up your Unity project. Make sure you're using a recent version of the Unity Editor; for this blog, we are working with version 2022.3.40f1 and the 3D Core template.


  2. Once your project is ready, head over to Unity's Asset Store and search for Meta Interaction SDK. Add it to your assets and install it in your project.

  3. While importing, if you see a pop-up asking you to restart the editor, restart it to apply the detected changes.

  4. Once you install it and restart the editor, you will see the latest version of the Interaction SDK installed, along with the Essentials and Core packages.

  5. Next, we need to install one more package to manage the plugins: in the Package Manager, go to Unity Registry → select XR Plugin Management → Install.



1.2 ISDK Package Folder Structure

  1. Meta XR Interaction SDK Essentials → Runtime → Prefabs 
    In the Runtime folder, you'll find the Prefabs folder with common VR interactions, such as hand models, controllers, and interactive objects. These prefabs simplify the development process by providing ready-to-use components that can be customized to fit your specific needs.
  2. Meta XR Interaction SDK Essentials → Runtime → Sample → Prefabs 
    Look here if you want complete interaction setups, such as Poke Interaction menus or Poke buttons.
  3. Meta XR Interaction SDK Essentials → Runtime → Sample → Objects 
    Look here if you want assets or prefabs for room environments and props.


1.3 Configuring Project Settings

  1. Navigate to File → Build Settings → select Android → Switch Platform

  2. Go to Player Settings → XR Plugin Management → make sure you are on the Android tab → install Oculus under Plug-in Providers

  3. Go to Meta XR → Fix All the errors → then Apply All the recommended changes



1.4 Setting Up the Scene

  1. Go to Meta XR Interaction SDK Essentials → Runtime → Sample → Objects → drag and drop the LargeRoom prefab into the Hierarchy window. Import TMP Essentials when prompted. 

  2. Then go to Meta XR Interaction SDK Essentials → Runtime → Sample → Materials → drag and drop the SkyboxGradient material into the Scene.

  3. Then, delete the Main Camera. There are two ways to add the OVRCameraRig:

    1. Navigate inside Meta XR Interaction SDK Essentials → Runtime → Prefabs → drag the OVRCameraRigInteraction prefab into the Hierarchy.

    2. The other way is to Right Click on the Hierarchy tab → Interaction SDK → Add OVR Interaction Rig.

    The second method was much simpler, right? That's a feature of the ISDK called "Quick Actions". We'll be making use of Quick Actions to add various types of interactions later on.

  4. Now, the OVR Interaction Rig comes preconfigured with different types of interactors, so to create an interaction, we just need to add the interactables.
    Before that, navigate inside Meta XR Interaction SDK Essentials → Runtime → Sample → Objects → Props → add each of the following objects, which we'll use for creating the various interactions:

    1. Grab Interaction (Mug)
    2. Ray Grab Interaction (Guitar)
    3. Distance Grab Interaction (Key)



1.5 Grab Interaction

Now we are going to add interactions to the objects. For the Grab Interaction, we are using the Mug.

  1. Right Click on the Mug in the Hierarchy tab → Interaction SDK → Add Grab Interaction

  2. The Grab Wizard appears, where you can choose different configurations, such as how the object interacts with the interactors and which grab types you would like to use; it also generates colliders if they are missing.

  3. You can also add all the required components, like a Rigidbody and Target Transform, using Fix All.

  4. Next, click the Create button. This creates a new GameObject named ISDK_HandGrabInteraction with all the components required for grabbing.

    Note: To add physics to the object → select the object → in its Box Collider, uncheck Is Trigger → in its Rigidbody, check Use Gravity and uncheck Is Kinematic
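    If you prefer scripting over the Inspector, the same physics setup can be applied from a small component. This is only a sketch using standard Unity APIs; the class name is our own, and the values mirror the manual steps in the note above:

    ```csharp
    using UnityEngine;

    // Hypothetical helper that applies the physics setup from the note above.
    // Attach it to a grabbable object such as the Mug.
    public class GrabbablePhysicsSetup : MonoBehaviour
    {
        void Awake()
        {
            // The collider must be solid (Is Trigger unchecked) for physics to apply.
            var box = GetComponent<BoxCollider>();
            if (box != null)
            {
                box.isTrigger = false;
            }

            // Use Gravity checked, Is Kinematic unchecked, so the object
            // falls and reacts to collisions once released.
            var body = GetComponent<Rigidbody>();
            if (body == null)
            {
                body = gameObject.AddComponent<Rigidbody>();
            }
            body.useGravity = true;
            body.isKinematic = false;
        }
    }
    ```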


1.6 Ray Grab Interaction


Now, to add the Ray Grab Interaction:

  1. Right Click on the Guitar in the Hierarchy tab → Interaction SDK → Add Ray Grab Interaction

    From the Grab Wizard you can choose different configurations once again.

  2. Note that there is an error at the top regarding the collider. If you click Fix, it will add a Box Collider, which is not suitable for the Guitar's curves and structure; instead, select the Guitar → Add Component → Mesh Collider.

  3. When you click Create, it creates a new GameObject named ISDK_RayGrabInteraction with all the components and references.


    Differences between Ray Grab Interaction and Distance Grab Interaction:

  • Visual Feedback: Ray grab typically involves a visible ray, whereas distance grab might not show a direct line but uses visual cues like highlighting or outlines.
  • User Experience: Ray grab can feel more like remote control, while distance grab provides a more direct, tactile interaction: you can grab the object, and once you release it, it returns to its original position.
  • Applications: Ray grab is often used for UI and precise selections, while distance grab is favored for immersive and interactive environments.
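
The Mesh Collider fix from step 2 can also be scripted. A minimal sketch under the assumption that the prop's mesh is importable as a collider; the class name is hypothetical:

```csharp
using UnityEngine;

// Hypothetical helper for curved props like the Guitar: a Box Collider is a
// poor fit for the mesh, so we add a MeshCollider that follows the geometry.
public class CurvedPropCollider : MonoBehaviour
{
    void Awake()
    {
        var meshCollider = gameObject.AddComponent<MeshCollider>();

        // Convex is required if the object will also carry a
        // non-kinematic Rigidbody (e.g. for physics-based grabbing).
        meshCollider.convex = true;
    }
}
```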

1.7 Distance Grab Interaction

Now we are going to add the interaction to the next object. For the Distance Grab Interaction, we are using the Key.

  1. Right Click on the Key in the Hierarchy tab → Interaction SDK → Add Distance Grab Interaction

  2. The Grab Wizard appears, where you can choose different configurations, such as how the object interacts with the interactors and which grab types you would like to use; it also generates colliders if they are missing. There are also optional components, like the Time Out Snap Zone, which snaps the object back to its original position when released. 

  3. Click Create and it creates a new GameObject named ISDK_DistanceHandGrabInteraction with all the components and references.



1.8 Interacting with UI Elements using UI Ray Interaction

Now we see how to interact with UI elements using Ray Interaction:

  1. First we need to create a canvas → Right Click on the Hierarchy tab → UI → Canvas → set Render Mode = World Space → set its position to X=0, Y=1.1, Z=0.8 → scale it to X=0.001, Y=0.001, Z=1


  2. Next we add a panel to the canvas: Right Click on the Canvas in the Hierarchy tab → UI → Panel, then make its color slightly darker and increase its alpha channel

  3. To add the UI elements, go to Packages in the Project tab → Meta XR Interaction SDK Essentials → Runtime → Sample → Objects → UISet → Prefabs → Button → then choose some buttons, a dropdown list, and a slider

  4. To interact with these elements, we can first use UI Ray Interaction: Right Click on the Canvas in the Hierarchy tab → Interaction SDK → Add Ray Interaction to Canvas → a component called PointableCanvasModule is missing → click Fix to add it to the EventSystem in the scene → click Create

  5. This creates a new GameObject named ISDK_RayInteraction with all the components and references.
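
The canvas setup from step 1 can likewise be reproduced in code. A sketch using standard Unity APIs, with a hypothetical class name and the same position and scale values as the manual steps:

```csharp
using UnityEngine;

// Hypothetical helper that configures a canvas for world-space VR UI,
// matching the values used in step 1 above.
public class WorldSpaceCanvasSetup : MonoBehaviour
{
    void Start()
    {
        // A world-space canvas lives in the scene like any other object.
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        var rect = GetComponent<RectTransform>();
        rect.position = new Vector3(0f, 1.1f, 0.8f);       // roughly chest height, within reach
        rect.localScale = new Vector3(0.001f, 0.001f, 1f); // shrink pixel units to metre scale
    }
}
```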



1.9 Interacting with UI Elements using Poke Interaction

  1. To interact with the canvas created above using Poke Interaction, Right Click on the Canvas in the Hierarchy tab → Interaction SDK → Add Poke Interaction to Canvas → click Create.

  2. This creates a new GameObject named ISDK_PokeInteraction with all the components and references.

  3. We just need to move the canvas a little forward so it is within arm's reach for the poke interaction; then we save and play the scene.



1.10 Building the Scene for Standalone Applications

To build it as a standalone application, we navigate to File → Build Settings → Add Open Scenes → Build and Run → create a folder called Builds → name your build → Save


2.0 Developing for PCVR


2.1 Setting Up the Project Settings

  1. Navigate to File → Build Settings → select Windows → Switch Platform

  2. Go to Player Settings → XR Plugin Management → make sure you are on the Windows tab → uncheck Oculus and check OpenXR under Plug-in Providers

  3. When you get an option to enable a feature set → Cancel

  4. In the Project Validation tab, you'll see a warning not to use OpenXR and the Oculus XR Plugin at the same time. To fix that, click Edit → remove the Oculus XR Plugin → return to Project Validation → Fix All


  5. Next, head to OpenXR → Add some of the Interaction Profiles like: 

    1. Oculus Touch Controller Profile
    2. Meta Quest Touch Pro Controller Profile
    3. Valve Index Controller Profile
    4. HTC Vive Controller Profile


2.2 Setting Up the Scene

  1. First, we delete the OVR Camera Rig.
  2. Right Click on the Hierarchy tab → Interaction SDK → Add UnityXR Interaction Rig


2.3 Building for PCVR

To build it for PCVR, we navigate to File → Build Settings → Add Open Scenes → Build and Run → create a folder called Builds → name your build → Save



3.0 Hand Interaction

Hand interaction can still work across platforms, but since the HTC Vive does not support hand tracking, we are only demonstrating it on Quest 3 in this project.

  1. Navigate to File → Build Settings
  2. Go to Player Settings → XR Plugin Management → check the Meta XR feature group
  3. In OpenXR → add the Hand Interaction Profile → enable the following feature groups:
    1. Hand Interaction Poses
    2. Hand Tracking Subsystem
    3. Meta Hand Tracking Aim


  4. Head to the scene → expand UnityXRCameraRigInteraction → UnityXRInteractionComprehensive → disable UnityXRControllerHands
  5. Navigate to File → Build Settings → Add Open Scenes → Build and Run → create a folder called Builds → name your build → Save

  6. Open SteamVR → make sure you are using the latest version → launch Settings → set Current OpenXR Runtime = SteamVR and Meta Plugin Compatibility = ON  
    Then, open Steam Link on the Quest → link it to your laptop → now you can use hand interaction in your app when you launch it.


Conclusion

With Meta's Interaction SDK, developing cross-platform VR experiences has never been easier. By following these steps, you can create interactive and immersive content that works seamlessly across various devices. Don't forget to explore the SDK's samples and try building them yourself to deepen your understanding.

Thank you for joining us on this journey to mastering cross-platform interactions. Stay tuned for more tutorials, and happy developing!

Thanks for reading this blog post 🧡

If you've enjoyed the insights shared here, why not spread the word? Share the post with your friends and colleagues who might also find it valuable.
Your support means the world to us and helps us create more content you'll love.