Create a VR Game in 20 Minutes: A Beginner's Journey
In this blog, you'll learn to create a mini VR game that combines haptic feedback with interactive elements for a truly immersive experience. By the end of this guide, you'll have a functional VR mini-game that you can build for both Quest standalone and PCVR devices.
1. Introduction
In the previous blogs, we explored designing haptics using Meta's Haptic Studio and integrating it into Unity with the Haptics SDK. We also discussed creating various types of interactions with the Interaction SDK.
2. Overview
Today, our goal is to combine haptics and interactions to create an immersive VR experience for both Quest standalone and PCVR devices.
3. Prerequisites
Unity Installation: Ensure you have the latest version of Unity installed, along with the latest version of the Meta XR All-in-One SDK.
4. Project Settings for Quest Standalone Devices
Now to set up the project for Quest standalone:
- Navigate to File → Build Settings → select Android and switch the platform.
- Go to Player Settings → XR Plugin Management and select Oculus as the plugin provider.
- Go to Meta XR → apply the recommended settings by clicking Fix All.
5. Developing the Game Scene
Let’s start by downloading a custom package that has all the assets and scripts you’ll need to build this VR game.
5.1 Minigame Overview
Before we start setting up our scene, let’s see how the mini-game works.
5.1.1 Tutorial
- The user starts off in the tutorial scene, where they are prompted to grab the bow and an arrow.
- Upon grabbing the bow, the user is prompted to press the trigger to activate the shield.
- The game logic verifies that the user has performed all three actions: grabbing the bow, pressing the trigger, and grabbing an arrow.
- Upon successful verification, tutorial enemies are activated; they shoot at the user and move around them.
- The user is then prompted to shoot the enemies twice to eliminate them.
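Under the hood, this kind of verification usually comes down to three flags that gate the enemy activation. Here's a minimal sketch of that logic — the class and method names are illustrative, not the actual script shipped in the package:

```csharp
using UnityEngine;

// Illustrative sketch of the tutorial's verification logic:
// three flags, set by interaction events, gate the enemy activation.
public class TutorialProgressSketch : MonoBehaviour
{
    private bool bowGrabbed, triggerPressed, arrowGrabbed;

    // These would be wired to the interaction events (e.g. via UnityEvents).
    public void OnBowGrabbed()     { bowGrabbed = true;     CheckProgress(); }
    public void OnTriggerPressed() { triggerPressed = true; CheckProgress(); }
    public void OnArrowGrabbed()   { arrowGrabbed = true;   CheckProgress(); }

    private void CheckProgress()
    {
        // Activate the tutorial enemies only once all three actions are done.
        if (bowGrabbed && triggerPressed && arrowGrabbed)
            Debug.Log("All tutorial steps complete - activating enemies.");
    }
}
```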
5.1.2 Main Game
- After the tutorial, users can access the main game.
- In the main game, enemies spawn in waves, and the objective is to survive as long as possible.
5.2 Scene Setup
Now that you know how the game works, let’s set up our scene.
- Navigate to Haptics ISDK Package → Scenes and open the Main Scene.
- To add the OVR Interaction Rig, right-click on the Hierarchy → navigate to Interaction SDK → select Add OVR Interaction Rig.
- Set up the bow and arrow interactions:
  - Right-click on Bow → navigate to Interaction SDK → select Add Grab Interaction.
  - Use the Grab Wizard to select the interactors and grab types, then click Create. This creates a Grab Interactable game object with all the components required to enable grabbing.
  - Next, create an empty object called BowGrabPoint, which we’ll use to constrain the grab angles. Set its transform values to X=0, Y=0, Z=0.09.
  - Then go to ISDK_HandGrabInteraction → drag BowGrabPoint and drop it in the Grab Source parameter of the Grab Interactable component.
- Configure the string interaction:
  - Right-click on ArrowAttachPoint → navigate to Interaction SDK → select Add Grab Interaction. This again creates a Grab Interactable game object with all the components required to enable grabbing.
  - When we grab the string, we want to be able to pull it along only one direction, so add a Grab Free Transformer component to the ISDK_HandGrabInteraction game object → check all the Position and Rotation constraints → constrain the Z-axis position between min = -0.25 and max = 0.
  - Scroll all the way up → drag and drop the Grab Free Transformer into the One Grab Transformer parameter of the Grabbable component.
- Now, to set up the entire bow and arrow interaction, reference the parameters as follows:
  - Select the SpecialBow prefab and assign its child object ISDK_HandGrabInteraction to the Grabbable and Interactable parameters of the Bow Interaction component.
  - From OVRCameraRig → OVRCameraRigComprehensive → OVR Controllers, select LeftController and reference it in the Controller parameter of the Controller Button Usage Active State component.
  - Select the ArrowAttachPoint game object and assign its child object ISDK_HandGrabInteraction to the String Grab Interactable and String Grabbable parameters of the Arrow to Bow Interaction component.
  - Select the Arrow prefab and assign its child object ISDK_HandGrabInteraction to the Arrow Grab Interactable and Arrow Grabbable parameters of the Arrow Interaction component.
- Add the Player Health component:
  - Navigate to OVRCameraRigInteraction → Tracking Space → select CenterEyeAnchor and add the Player Health component.
  - Add a Box Collider → adjust its size and position to X=0.2, Y=1.3, Z=0.2 → check the Is Trigger parameter.
  - Select CenterEyeCamera → drop it in the Player Health parameter of the Main Game Controller component on the MainGameController game object.
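Since the collider on the CenterEyeAnchor is a trigger, a player health script along these lines can react to enemy projectiles entering it. This is a hypothetical sketch, not the exact script in the package; the "EnemyProjectile" tag is an assumption:

```csharp
using UnityEngine;

// Hypothetical sketch of a trigger-based player health component.
public class PlayerHealthSketch : MonoBehaviour
{
    [SerializeField] private int maxHealth = 100;
    [SerializeField] private int damagePerHit = 10;
    private int currentHealth;

    private void Awake() => currentHealth = maxHealth;

    private void OnTriggerEnter(Collider other)
    {
        // Assumed tag; the actual project may identify projectiles differently.
        if (!other.CompareTag("EnemyProjectile")) return;

        currentHealth -= damagePerHit;
        if (currentHealth <= 0) Debug.Log("Game over");
    }
}
```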
- Apply the changes to the prefabs:
  - Select the Arrow and SpecialBow prefabs.
  - From the Overrides dropdown, make sure to Apply All changes.
- Test the bow and arrow interaction:
  - Navigate to File → Build Settings.
  - Select Player Settings → XR Plugin Management.
  - Ensure Oculus is checked in the PC tab.
  - Connect your headset using Link or Air Link.
  - Press Play.

Now you should be able to grab the bow, grab an arrow, and bring it closer to interact with both. Upon releasing the grab button, the arrow is fired with a force proportional to the amount of string pull.
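"Force proportional to the string pull" typically means scaling an impulse by the normalized pull distance. Here's a rough sketch of that idea — the force value and method names are made up for illustration, and the 0.25 m limit mirrors the Z-axis constraint set on the Grab Free Transformer:

```csharp
using UnityEngine;

// Rough sketch: fire the arrow with an impulse proportional to string pull.
public class ArrowLauncherSketch : MonoBehaviour
{
    [SerializeField] private float maxLaunchForce = 30f;    // illustrative value
    [SerializeField] private float maxPullDistance = 0.25f; // matches the Z-axis constraint

    public void Fire(Rigidbody arrow, float pullDistance)
    {
        // Normalize the pull against the transformer's Z-axis limit.
        float pull01 = Mathf.Clamp01(pullDistance / maxPullDistance);
        arrow.isKinematic = false;
        arrow.AddForce(transform.forward * (maxLaunchForce * pull01), ForceMode.Impulse);
    }
}
```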
5.3 Add Haptics for Interactions
Now, to add haptics for each of the interactions, like pulling the string, shooting the arrow, and activating the shield, we need to write a couple of scripts.
- First, you can find a script called Haptics Controller. This script provides a simple way to reference and access the Controller component, which is used to trigger haptic feedback.
- To use this script, head to OVRCameraRigInteraction → OVRCameraRig → OVRCameraRigComprehensive → OVR Controllers → go to LeftController and RightController → ControllerInteractors → for both controllers, add the HapticsController script → make sure the script is set to Left for the LeftController and Right for the RightController.
- Next, you can find the script called BowInteractionHapticsManager. This script:
  - triggers specific vibration patterns for actions such as pulling the bowstring, shooting an arrow, and interacting with a shield, providing feedback for actions like opening, closing, or blocking projectiles;
  - listens for events when the bow and string are grabbed or released and adjusts the haptic feedback accordingly;
  - dynamically updates the string-pull feedback based on the strength of the pull and plays corresponding haptic responses using different priority levels for each action.
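To make this concrete, here is a minimal sketch of how such a manager can drive clips through the Haptics SDK. The class and field names are illustrative, not the exact ones in the package; `HapticClipPlayer` and `Controller` come from the Haptics SDK's `Oculus.Haptics` namespace:

```csharp
using Oculus.Haptics;
using UnityEngine;

// Illustrative sketch: a looping string-pull clip whose strength follows
// the pull amount, plus a higher-priority one-shot when the arrow fires.
public class BowHapticsSketch : MonoBehaviour
{
    [SerializeField] private HapticClip stringPullClip; // assigned in the Inspector
    [SerializeField] private HapticClip arrowShotClip;

    private HapticClipPlayer stringPullPlayer;
    private HapticClipPlayer arrowShotPlayer;

    private void Start()
    {
        stringPullPlayer = new HapticClipPlayer(stringPullClip) { isLooping = true };
        arrowShotPlayer = new HapticClipPlayer(arrowShotClip) { priority = 1 };
    }

    public void OnStringGrabbed() => stringPullPlayer.Play(Controller.Right);

    // Called while the string is held; pull01 is the normalized pull (0..1).
    public void UpdateStringPull(float pull01)
    {
        stringPullPlayer.amplitude = pull01; // scale vibration with pull strength
    }

    public void OnArrowFired()
    {
        stringPullPlayer.Stop();
        arrowShotPlayer.Play(Controller.Right); // higher-priority one-shot
    }
}
```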
- Now add this script to the SpecialBow prefab, and to set it up, reference the parameters as follows:
  - For the Bow Grab Interactable parameter, drag and drop the ISDK_HandGrabInteraction.
  - For the String Grab Interactable parameter, drag and drop the ISDK_HandGrabInteraction of ArrowAttachPoint.
  - Assign the respective haptic clips to each of the following:
    - String Pull Haptics Clip = StringPull (Haptic Clip)
    - Arrow Shot Haptics Clip = ShootArrow (Haptic Clip)
    - Shield Open Haptics Clip = ShieldActivated (Haptic Clip)
    - Projectile Blocked Clip = ProjectileBlocked (Haptic Clip)
- Since audio and haptics go hand in hand, we have another script called the Audio Manager. This script has the same functionality as the bow haptics manager, but it plays audio instead of the haptic effects.
- Add this script to the SpecialBow prefab and reference the parameters as follows:
  - For the String Grab Interactable parameter, reference the ISDK_HandGrabInteraction of the ArrowAttachPoint game object.
  - Create two AudioSource components, one on the Shield and one on the StringVisual, then reference them in the respective Audio Source parameters.
  - For the audio clips, assign their respective clips.
  - For the Stretch Audio Step parameter, choose the last preset.
- Finally, make sure to apply all overrides for this prefab and save the scene.
6. Tutorial Scene Setup
Alright, now it’s time to set up the tutorial scene.
- Select the Tutorial Scene and add it to your hierarchy.
- Duplicate the OVRCameraRigInteraction and move it inside the Tutorial Scene.
- Remove the Main Scene (no need to save changes).
- Navigate to Arrow → ISDK_HandGrabInteraction.
- Scroll down and add the Interactable Unity Event Wrapper component.
- Drag and drop the ISDK_HandGrabInteraction game object into the Interactable View and select Grab Interactable.
- In the Interactable Unity Event Wrapper component, add an item to the “On Selected” event → reference the Tutorial Game Controller → from the dropdown, select Tutorial Game Controller and choose On Arrow Grabbed.
- Similarly, navigate to the SpecialBow prefab → ISDK_HandGrabInteraction → add the Interactable Unity Event Wrapper component.
- Drag and drop the ISDK_HandGrabInteraction game object into the Interactable View and select Grab Interactable.
- In the Interactable Unity Event Wrapper component, add an item to the “On Selected” event → reference the Tutorial Game Controller → from the dropdown, select Tutorial Game Controller and choose On Bow Grabbed.
7. Building Standalone Application
With that, we’ve finished developing our game. Now, to build and test it:
- Navigate to File → Build Settings.
- Add the Tutorial Scene.
- Select the Main Scene → drag and drop it into the build settings.
- Connect your Quest device.
- Click Build and Run.
- Create a new folder named Builds.
- Name your file and click Save.
Once the build succeeds, you can test the game on your headset. While testing the interactions, you’ll notice:
- Grabbing the bow and pulling the string activates the haptics and audio.
- Pressing the trigger opens the shield and activates haptics that last until the shield is completely activated.
After the game ends, you can choose to Retry or Quit.
8. PCVR Project Setup
Now let’s see how to build this game for other PCVR devices.
- Navigate to File → Build Settings → select Windows → Switch Platform.
- Navigate to Player Settings → XR Plugin Management → select OpenXR as the plugin provider.
- If prompted to enable the Meta XR feature set, click Cancel.
- Navigate to OpenXR → add the interaction profiles: Valve Index Controller, HTC Vive Controller, Oculus Touch Controller, and Meta Quest Touch Pro Controller.
- Select Project Validation → scroll down until you find a recommendation to have either OpenXR or Oculus as the plugin provider → click Edit → remove the Oculus XR Plugin.
9. PCVR Scene Setup
The way the scene is set up for PCVR is slightly different, but not by much.
9.1 Main Scene Setup
- Navigate to Haptics ISDK Package → Scenes → open the Main Scene.
- Select OVRCameraRigInteraction → delete it.
- Right-click on the Hierarchy → navigate to Interaction SDK → add the Unity XR Interaction Rig.
- Open the child objects → navigate to XR Space → select Main Camera.
- Add the Player Health component → add a Box Collider → adjust its size and position (Center = 0, 0.6, 0 | Size = 0.2, 1.2, 0.2) → check Is Trigger.
- Open the child objects of UnityXRInteractionComprehensive → select XRControllerHands → disable it. Instead, select UnityXRControllers → enable it.
- Select the controller interactors for both the left and right controllers → add the HapticController component.
- Adjust the transform of BowGrabPoint: X-axis: 270, Y-axis: 0, Z-axis: -90.
- Adjust the transform of ArrowGrabPoint: rotate the Y-axis by 180.
- Select both prefabs and make sure to apply the overrides.
9.2 Tutorial Scene Setup
- Drag and drop the Tutorial Scene → duplicate the UnityXRCameraRigInteraction from the Main Scene → move it inside the Tutorial Scene.
- Remove the OVRCameraRigInteraction from the Tutorial Scene → save both scenes.
10. Building PCVR Application
- Navigate to File → Build Settings → click Build.
- Create a new folder called PC Builds → select this folder as the build location.
After the build is successful, you can test it on any PCVR device. In this case, the game was tested on an HTC Vive using SteamVR.
While testing, you’ll notice that the haptics work perfectly fine and sync with the audio as well. However, since the HTC Vive has a different type of vibration actuator whose frequency cannot be altered, you’ll notice a small difference compared to Quest standalone devices like the Quest 3 or Quest Pro.
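If you ever need a device-agnostic fallback on PCVR, Unity's generic XR input layer exposes simple amplitude/duration haptic impulses, which is roughly all an actuator like the Vive's can do. This sketch uses `UnityEngine.XR`; note that the clip-based frequency control of the Haptics SDK does not apply here:

```csharp
using UnityEngine.XR;

// Sketch: amplitude/duration-only haptics via Unity's generic XR input API.
public static class PcvrHapticsSketch
{
    public static void Pulse(XRNode hand, float amplitude, float seconds)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        if (device.TryGetHapticCapabilities(out HapticCapabilities caps)
            && caps.supportsImpulse)
        {
            // Channel 0; amplitude in [0,1]. Frequency cannot be set per impulse.
            device.SendHapticImpulse(0, amplitude, seconds);
        }
    }
}
```

Usage would look like `PcvrHapticsSketch.Pulse(XRNode.RightHand, 0.8f, 0.1f);` inside the same event handlers that trigger the Quest haptic clips.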
Conclusion
Now that you have learned to use haptics and interaction together, you're equipped to create immersive VR experiences. The next steps for your project could include implementing player health systems, developing logic for tracking how many waves users survive, or adding enemy explosions to create controller vibrations. If you create something exciting, we invite you to share it with our community on Discord.
Thank you for following along with this tutorial. Be sure to like and subscribe for more content, and I look forward to seeing you in the next one. Happy developing!