How to Make a Unity App for Oculus Quest 2 (Part 1: Project Setup)

Using Unity Oculus Integration

Alper Canberk
5 min read · Jul 14, 2021

When bustling office environments turned into dull Zoom calls at the start of the pandemic, many people lost the ability to share ideas and collaborate with their co-workers as freely as they once did. Although Virtual Reality offers a variety of immersive online collaboration platforms to address this, none seem to have a decent whiteboard feature that can replace the real experience.

This is why I set out to create a Unity VR application that can:

  • Allow you to easily create, move, and rotate whiteboards in 3D space.
  • Let you attach the virtual whiteboard to a real wall, thus making the writing experience more natural.
  • Track your hands and use your fingers as a marker, eliminating the need for VR controllers.

Write on the surface with your index finger, and use a pinky pinch to clear the board.

Although I successfully implemented the features above, I soon realized that the precision of Quest 2 hand tracking is not quite good enough to make the whiteboard experience feel right. So instead of taking the project further and making it multiplayer, I decided to write this tutorial to make the Unity Oculus Integration learning curve a bit more tolerable for beginners like me.

P.S. I don’t claim to be an expert in any of these topics, so feel free to make suggestions. If you would like to contribute to the project or download the code and test it out yourself, here’s the GitHub link.

*This tutorial assumes basic knowledge of the Unity game engine. If that’s not the case, Unity has lots of great tutorials on its own website.

Project Setup

The official documentation for the Unity Oculus Integration gives very clear instructions on how to set up your Unity project for Oculus development, and I strongly recommend checking it out for the finer details. Here’s a summary of the important steps you have to take to get started quickly. Warning: some of these steps take a very long time to download and set up, so grab a snack and prepare to wait in front of your screen.

Oculus Quest Device Setup for Development

  1. Connect your headset to the Oculus companion app for iOS or Android if you haven’t already.
  2. On the app, go to Settings, tap your device and go to More Settings > Developer Mode, and toggle Developer Mode.
  3. Connect your Oculus Quest to your computer using a USB-C cable and put on your headset.
  4. Inside the headset, you will be prompted to allow USB debugging. Accept Allow USB Debugging and Always allow from this computer.

Unity Setup

1. Install a version of Unity (2019.x or higher) that includes the Android Build Support module.

2. Create a new Unity 3D project using this version of Unity.

3. Import the Oculus Integration asset from the Unity Asset Store. This import takes a while. If you are prompted to update any plugins during the import, just accept all of the updates.

4. Go to File > Build Settings and select Android as your platform. Set Texture Compression to ASTC.

5. If you set up your Oculus device correctly in the previous section, you should see your headset listed under Run Device. If you don’t see your device, put on your headset again and check whether there are any prompts you need to accept. Select your device from that menu. Finally, click Switch Platform. Keep in mind that this step also takes a while to complete.
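
If you end up setting up more than one project, steps 4 and 5 can also be scripted. Here is a minimal editor sketch, assuming you place the file in a folder named Editor; the class and menu names are my own:

```csharp
using UnityEditor;

// Hypothetical editor helper: replicates steps 4-5 (Android platform + ASTC textures).
// Must live in an "Editor" folder so Unity compiles it as editor-only code.
public static class QuestBuildSetup
{
    [MenuItem("Tools/Switch To Quest Build Target")]
    public static void SwitchToQuest()
    {
        // Use ASTC texture compression, as recommended for Quest.
        EditorUserBuildSettings.androidBuildSubtarget = MobileTextureSubtarget.ASTC;

        // Switch the active build target to Android (this is the slow part).
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.Android, BuildTarget.Android);
    }
}
```

Running Tools > Switch To Quest Build Target then does the same thing as clicking through the Build Settings window.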

6. Go to Edit > Project Settings, and you should see a tab named XR Plugin Management at the bottom of the panel. Click on the tab and press Install XR Plugin Management. Then tick the Oculus checkbox under the Android tab.
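
Optionally, you can confirm from a running build (or from the editor Console) that the XR plugin actually initialized. The sketch below is just a hypothetical sanity check; the class name is mine, while XRSettings is Unity’s API and OVRManager ships with the Oculus Integration you imported in step 3:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sanity-check component: attach it to any GameObject and check the
// Console (or adb logcat on device) to confirm the XR plugin initialized.
public class XrSanityCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"XR device active: {XRSettings.isDeviceActive} (loaded device: '{XRSettings.loadedDeviceName}')");

        // OVRManager comes from the Oculus Integration package.
        Debug.Log($"HMD present: {OVRManager.isHmdPresent}");
    }
}
```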

Hello World

It’s time to hop into your virtual world! In your Unity scene,

  • Delete the Main Camera under Hierarchy.
  • Search for the OVRCameraRig prefab in your Assets (it comes with the Oculus Integration package). Once you find it, drag and drop it into the scene Hierarchy.
  • Now, under the OVRManager component of your OVRCameraRig located in the scene, set Tracking Origin Type to Floor Level instead of Eye Level.
  • We want to start interacting with the world with our hands soon, so enable hand tracking by changing Hand Tracking Support from Controllers Only to Hands Only.
  • Now that you have set up the necessary cameras, create a few game objects in your world (a plane, cubes, spheres, etc.; a script that spawns these for you is sketched after this list), make sure your Oculus Quest is plugged into your computer, and press Ctrl+B / Cmd+B (or File > Build And Run).
  • Put on your headset, accept any permissions/prompts and you should spawn in your own Unity world.
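
If you’d rather populate the test scene from code than drag primitives around by hand, here is a minimal sketch. The class name is my own, and the OVRManager call at the end simply mirrors the Floor Level setting from the earlier bullet, so it’s optional:

```csharp
using UnityEngine;

// Hypothetical scene bootstrap: spawns a floor and a few test objects at startup.
public class TestSceneSpawner : MonoBehaviour
{
    void Start()
    {
        // A floor at the origin (a default Unity plane is 10 x 10 units at scale 1).
        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
        floor.transform.position = Vector3.zero;

        // A couple of objects at roughly desk height so they are easy to find.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0.5f, 1.0f, 1.5f);
        cube.transform.localScale = Vector3.one * 0.3f;

        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = new Vector3(-0.5f, 1.2f, 1.5f);
        sphere.transform.localScale = Vector3.one * 0.3f;

        // Mirrors the "Floor Level" inspector setting on OVRManager (Oculus Integration).
        if (OVRManager.instance != null)
        {
            OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
        }
    }
}
```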

Hand Tracking

Once inside your Unity VR world, you might raise your hands expecting to see them, but we’re not quite there yet. Although hand tracking is enabled, we haven’t added the hand game objects themselves.

  • Just as we found OVRCameraRig, search for OVRHandPrefab in your Assets.
  • Since we’re going to modify the camera rig, right-click the OVRCameraRig in the Hierarchy and choose Prefab > Unpack.
  • In the Hierarchy, expand OVRCameraRig > TrackingSpace. You should see a LeftHandAnchor and a RightHandAnchor. Drag an OVRHandPrefab onto each of them.
  • The default OVRHandPrefab is set up for a left hand, so click on the OVRHandPrefab you placed under RightHandAnchor and switch the hand type from left to right on its OVRHand, OVRSkeleton, and OVRMesh components.

Build and run your project again, put on your headset, and look at your hands. There they are!
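
If you want to peek at the data that will later drive the whiteboard, you can attach a small debug component like the sketch below to either OVRHandPrefab. The class name is mine, while OVRHand and OVRSkeleton are the components you just configured:

```csharp
using UnityEngine;

// Hypothetical debug component: attach it next to OVRHand/OVRSkeleton on a hand prefab
// to log the index-fingertip position and a pinky pinch (used later to clear the board).
public class HandDebug : MonoBehaviour
{
    private OVRHand hand;
    private OVRSkeleton skeleton;

    void Start()
    {
        hand = GetComponent<OVRHand>();
        skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        // Pinky pinch: the gesture this project uses to clear the whiteboard.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Pinky))
        {
            Debug.Log("Pinky pinch detected");
        }

        // Index fingertip position: the "marker tip" for writing on the board.
        foreach (OVRBone bone in skeleton.Bones)
        {
            if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
            {
                Debug.Log($"Index tip at {bone.Transform.position}");
                break;
            }
        }
    }
}
```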

Now that all the tools are in place, it’s time to start developing the whiteboard app, which I will cover in Part 2 of this series.

Hope you enjoyed!
