How to Make a Unity App for Oculus Quest 2 (Part 2: Hand Tracking Whiteboard)
Using Unity Oculus Integration
Welcome to the second part of this series, where we will start developing our hand tracking whiteboard project. By the end, we will have an app that:
- Lets you easily create, move, and rotate whiteboards in 3D space.
- Lets you align the virtual whiteboard with a real wall, making the writing experience more natural.
- Tracks your hands and uses your fingers as a marker, eliminating the need for VR controllers.
If you haven’t seen Part 1, I encourage you to go read it, as it covers the setup necessary to begin this project.
As an optional first step, I highly recommend creating a plane game object in your Unity project so that you have an idea of where the ground is located. Once that’s done, we need to start working on the two main components of our project: Whiteboard and WhiteboardPen.
Whiteboard
Let’s first create our whiteboard by adding a new plane to the scene with the following transform, so that the whiteboard appears right in front of us.
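As a starting point, transform values along these lines place the plane upright about 1.5 m in front of the default camera position (these numbers are my own suggestion, not the exact ones from my scene; adjust to taste):

```
Position: (0, 1, 1.5)
Rotation: (-90, 0, 0)     // a Unity plane faces up by default; flip the X sign if the board faces away from you
Scale:    (0.2, 1, 0.15)  // a plane is 10×10 units at scale 1, so this gives roughly a 2 m × 1.5 m board
```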
Now, build your project and put on your Oculus headset. If the game object does not appear right in front of you, your headset view might be off center. If this is the case, go to Settings and Reset View. Next,
- Create a new layer for whiteboards (which will later be used for raycasting), and assign this layer to your newly created whiteboard game object. In my project, I named this layer Whiteboard and assigned it to Layer 10.
- Rename your plane to “Whiteboard” and create a Whiteboard.cs script under it. Along with that, create a DrawCircle.cs script in your assets.
This script extends the conventional Texture2D class and gives us a quick and easy method to draw circles on a texture.
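In case you are writing DrawCircle.cs from scratch, here is a minimal sketch of such an extension method on Texture2D. The class and method names (`Texture2DExtensions`, `DrawCircleOn`), the pixel-loop approach, and the clamping are my assumptions; adapt them to your own script:

```csharp
using UnityEngine;

// DrawCircle.cs — a sketch of an extension method that fills a circle of the
// given pixel radius around (x, y) on a Texture2D.
public static class Texture2DExtensions
{
    public static void DrawCircleOn(this Texture2D tex, Color color, int x, int y, int radius)
    {
        float rSquared = radius * radius;

        // Clamp the loop bounds so we never write outside the texture.
        int xMin = Mathf.Max(x - radius, 0);
        int xMax = Mathf.Min(x + radius, tex.width - 1);
        int yMin = Mathf.Max(y - radius, 0);
        int yMax = Mathf.Min(y + radius, tex.height - 1);

        for (int u = xMin; u <= xMax; u++)
            for (int v = yMin; v <= yMax; v++)
                if ((x - u) * (x - u) + (y - v) * (y - v) <= rSquared)
                    tex.SetPixel(u, v, color);

        // Call tex.Apply() after a batch of draws to upload the changes to the GPU.
    }
}
```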
- Next, open up your Whiteboard.cs script and write/copy & paste the following code:
At every frame, this script checks whether the pen is touching the whiteboard. If it is, the script renders a circle on the board’s texture at the coordinates set by the public method SetTouchPosition. These public methods will make more sense once we write our WhiteboardPen.cs script.
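If you prefer to write Whiteboard.cs yourself, here is a minimal sketch of what it can look like. The texture size, circle radius, `ToggleTouch` helper, and the `DrawCircleOn` name (the circle-drawing helper from DrawCircle.cs) are my assumptions:

```csharp
using UnityEngine;

// Whiteboard.cs — each frame, if the pen reports a touch, draw a circle on
// the board's texture at the last reported touch position.
public class Whiteboard : MonoBehaviour
{
    private Texture2D texture;
    private Vector2 textureSize = new Vector2(2048, 2048);
    private bool touching;
    private Vector2 touchPos;   // normalized (0..1) UV coordinates of the pen tip

    void Start()
    {
        texture = new Texture2D((int)textureSize.x, (int)textureSize.y);
        GetComponent<Renderer>().material.mainTexture = texture;
    }

    void Update()
    {
        if (!touching) return;

        int x = (int)(touchPos.x * textureSize.x);
        int y = (int)(touchPos.y * textureSize.y);
        texture.DrawCircleOn(Color.black, x, y, 10);  // hypothetical helper from DrawCircle.cs
        texture.Apply();
    }

    // Called by WhiteboardPen each frame while the pen tip is on the board.
    public void SetTouchPosition(float x, float y) => touchPos = new Vector2(x, y);
    public void ToggleTouch(bool isTouching) => touching = isTouching;
}
```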
Now, unroll OVRCameraRig > RightHandAnchor (or left hand, depending on what your dominant hand is), and create a new C# script named WhiteboardPen.cs under the OVRHandPrefab.
This is the first time we use the OVRHand and OVRSkeleton components given to us by the Oculus Integration. If you would like to learn more about these core building blocks of Oculus hand tracking, you can either check out the script itself or the official documentation, though arguably the most important functionalities of this script — bone positions and pinching gestures — are both used in WhiteboardPen.cs.
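As a rough guide, WhiteboardPen.cs can be sketched like this: it finds the index fingertip bone in the OVRSkeleton, raycasts a short distance from the tip against the Whiteboard layer, and reports the hit’s UV coordinates to the board. The ray direction, distances, and method names on Whiteboard are my assumptions:

```csharp
using UnityEngine;

// WhiteboardPen.cs — attached to OVRHandPrefab; turns the index fingertip
// into a marker by raycasting against the Whiteboard layer (layer 10).
public class WhiteboardPen : MonoBehaviour
{
    private Whiteboard board;
    private OVRSkeleton skeleton;
    private Transform fingerTip;

    void Start()
    {
        board = FindObjectOfType<Whiteboard>();
        skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        // The skeleton's bone list is populated asynchronously, so look the
        // index tip up lazily instead of in Start().
        if (fingerTip == null)
        {
            foreach (var bone in skeleton.Bones)
                if (bone.Id == OVRSkeleton.BoneId.Hand_IndexTip)
                    fingerTip = bone.Transform;
            if (fingerTip == null) return;
        }

        // Raycast a few centimeters from the fingertip, only against layer 10.
        // Using the hand's forward axis as the ray direction is an assumption.
        if (Physics.Raycast(fingerTip.position, transform.forward,
                            out RaycastHit hit, 0.03f, 1 << 10))
        {
            board = hit.collider.GetComponent<Whiteboard>();
            board.SetTouchPosition(hit.textureCoord.x, hit.textureCoord.y);
            board.ToggleTouch(true);
        }
        else if (board != null)
        {
            board.ToggleTouch(false);
        }
    }
}
```

Note that `RaycastHit.textureCoord` only returns meaningful UVs when the board uses a MeshCollider, which a Unity plane has by default.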
Once the WhiteboardPen game object is set up with our new script, build your project and jump into your Unity world. This is what you should see. We’re finally making some progress!
Currently, we have a working whiteboard, but it’s a fixed size, we can’t move or rotate it, and we can’t dynamically create more boards if we need to. To add all of this functionality, we need to:
- Create a Resources folder in Assets, and drag the whiteboard game object into it to make it a prefab. After doing this, also add a cube game object under the whiteboard, and scale and position the cube’s transform as shown:
- Create a sphere prefab, color it red, and scale it down to 0.01 on all axes. This sphere will be used to mark the points at which we rotate the plane.
Our goal is to be able to create whiteboards by holding the left hand middle pinch for a few seconds, dragging the pinch horizontally and vertically to resize the board, and releasing the pinch to finish creating it. Since Whiteboard is now a prefab, we can instantiate it at runtime; however, dynamically creating whiteboards poses a small challenge. The whiteboard we instantiate starts at a fixed size, and it will already have created a texture of that size by the time we release our middle pinch. To address this problem, let’s open up Whiteboard.cs and move the code inside Start() into a new public method we will call Initialize(). We also don’t want to execute the code in Update() unless the whiteboard is initialized, so
- create a new public bool isActive, (you’ll find out later why it’s public)
- Add isActive = true; to Initialize()
- Add if (!isActive) return; at the beginning of Update()
- Additionally, let’s change the right hand pinky pinch action in WhiteboardPen.cs. Instead of reloading the scene when the right pinky pinch is detected, let’s reinitialize the current whiteboard by calling the Initialize() method we just added to the class.
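After this refactor, the relevant part of Whiteboard.cs can be sketched as follows (field names match my earlier assumptions; the Start() body has simply moved into Initialize()):

```csharp
using UnityEngine;

// Whiteboard.cs after the refactor: texture creation runs on demand, once
// the board has reached its final size, instead of in Start().
public class Whiteboard : MonoBehaviour
{
    public bool isActive;   // public so other scripts can check it later
    private Texture2D texture;
    private Vector2 textureSize = new Vector2(2048, 2048);

    public void Initialize()
    {
        texture = new Texture2D((int)textureSize.x, (int)textureSize.y);
        GetComponent<Renderer>().material.mainTexture = texture;
        isActive = true;
    }

    void Update()
    {
        if (!isActive) return;
        // ...drawing code as before...
    }
}
```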
- Now, let’s create an empty game object in the scene named WhiteboardUtils and add a new C# script called WhiteboardUtils.cs under it. In that script, we will be using the following code:
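Since the full WhiteboardUtils.cs is fairly long, here is a minimal sketch of its core loop, covering only the create-and-resize flow described above (the field names, the drag-to-scale math, and the use of inspector slots instead of Resources.Load are my assumptions):

```csharp
using UnityEngine;

// WhiteboardUtils.cs — hold the left middle-finger pinch to spawn a board,
// drag to resize it, and release the pinch to finalize it.
public class WhiteboardUtils : MonoBehaviour
{
    public OVRHand leftHand;               // drag the left OVRHandPrefab here
    public GameObject whiteboardPrefab;    // the prefab from the Resources folder
    public GameObject markerSpherePrefab;  // the small red sphere for rotation points

    private GameObject currentBoard;
    private Vector3 pinchStart;

    void Update()
    {
        bool pinching = leftHand.GetFingerIsPinching(OVRHand.HandFinger.Middle);

        if (pinching && currentBoard == null)
        {
            // Pinch began: spawn a new board at the pinch position.
            pinchStart = leftHand.transform.position;
            currentBoard = Instantiate(whiteboardPrefab, pinchStart, Quaternion.identity);
        }
        else if (pinching && currentBoard != null)
        {
            // While pinching: stretch the board between the start point and
            // the hand. A plane is 10 units across at scale 1, hence the /10.
            Vector3 drag = leftHand.transform.position - pinchStart;
            currentBoard.transform.localScale =
                new Vector3(Mathf.Abs(drag.x) / 10f, 1f, Mathf.Abs(drag.y) / 10f);
            currentBoard.transform.position = pinchStart + drag / 2f;
        }
        else if (!pinching && currentBoard != null)
        {
            // Pinch released: the size is final, so the texture can be created now.
            currentBoard.GetComponent<Whiteboard>().Initialize();
            currentBoard = null;
        }
    }
}
```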
Finally, drag the left and right OVRHandPrefab objects, the whiteboard prefab, and the sphere prefab to their respective slots.
The scene should now be good to go! Build your project again and put your headset back on.
The benefit of being able to resize and rotate the whiteboard is that you can align your virtual whiteboard with a real wall in your house. This way, the tactile feedback of the wall and the reduced tension from resting your index finger on it make the whiteboard experience feel more realistic.
Although this is as far as I have come, the project is nowhere near perfect or complete. Here are some features that can easily be added:
- Erasing
- Deleting whiteboards
- Picking custom marker color
- Adjusting marker width
The next step would be to make this project multiplayer: set up a way to create and share rooms, and implement a spatial sound system. I would strongly recommend Normcore.io for building such multiplayer projects in Unity.
Thank you for reading this article. To be honest, when I first started developing this app, I found the Oculus Integration quite challenging. There aren’t too many helpful tutorials out there, especially on the matter of hand tracking. I also found the official Oculus documentation to be a little terse, so I sincerely hope that this tutorial has helped you understand the Unity VR development process a little bit better.