VR Design Research Labs

XR Rig in Unity

The Unity learning platform describes the XR Rig as the user's eyes, ears and hands in VR. That description captures the essence of the XR Rig and sums up its importance.

Adding it to the VR experience we build is a must.

XR Interaction Toolkit 1, Unity.

Recent versions of the Unity engine come with a developer-friendly package called the XR Interaction Toolkit. The Toolkit is an interaction system for VR and AR: it helps set up interactable objects, interactors and a locomotion system in Unity projects.

XR Interaction Toolkit 2, Unity.

It is a high-quality, handy system that makes creating VR (and AR) experiences much easier than before. In addition, Unity provides extensive documentation, learning resources and practice tasks for it.

It is important to set it up properly in the project, with all the basic elements in place, and to build on top of that.

XR Interaction Toolkit 3, Unity.

The experience I created was developed for the Android platform, targeting the Meta Quest 2.

I used a locomotion system so the player can move around the scene (slowly), together with a Snap Turn component for rotation. Continuous movement is triggered with the left-hand controller, while the snap rotation is activated with the right-hand controller.
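The same movement settings can also be adjusted from a script. Below is a minimal sketch, assuming XR Interaction Toolkit 2.x with the action-based providers already added to the Locomotion System object; the speed and angle values are illustrative, and the left/right hand bindings themselves are assigned through the providers' input actions in the Inspector.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: configures continuous movement and snap turning in code.
// Assumes the action-based providers of XR Interaction Toolkit 2.x.
public class LocomotionSetup : MonoBehaviour
{
    [SerializeField] ActionBasedContinuousMoveProvider moveProvider;
    [SerializeField] ActionBasedSnapTurnProvider snapTurnProvider;

    void Start()
    {
        // Slow walking speed so the player moves gently around the scene.
        moveProvider.moveSpeed = 1.0f;        // metres per second

        // Rotate in fixed steps; a short debounce prevents accidental double turns.
        snapTurnProvider.turnAmount = 45f;    // degrees per snap
        snapTurnProvider.debounceTime = 0.5f; // seconds between snaps
    }
}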

It was important to add an XR Origin element and an XR Interaction Manager object to every scene in Unity, and then to add components to them and adjust those components to the needs of this build.

Locomotion Unity.
Own Project, Unity.
Own Project 2, Unity.
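As a quick sanity check for this step, a small script can warn if either of these objects is missing from a scene. This is only a sketch, assuming XR Interaction Toolkit 2.x, where the XR Origin type lives in the Unity.XR.CoreUtils namespace; the log messages are illustrative.

using Unity.XR.CoreUtils;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: warns at startup if the two mandatory XR objects are not in the scene.
public class XrSceneCheck : MonoBehaviour
{
    void Awake()
    {
        if (FindObjectOfType<XROrigin>() == null)
            Debug.LogWarning("No XR Origin found in this scene.");

        if (FindObjectOfType<XRInteractionManager>() == null)
            Debug.LogWarning("No XR Interaction Manager found in this scene.");
    }
}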

The left-hand and right-hand controllers are the player's hands in the experience. Both are children of the Camera Offset, which in turn is a child of the XR Rig in the Hierarchy. The third child of the Camera Offset is the Main Camera, which has a Tracked Pose Driver component added.
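For reference, the hierarchy in my scenes looks roughly like this (object names follow the default XR Origin setup):

XR Rig (XR Origin)
    Camera Offset
        Main Camera (with Tracked Pose Driver)
        LeftHand Controller
        RightHand Controller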

I used the raycast option for my hand controllers and only had to make a few changes there. The default raycast line is a solid red colour, which I did not like, so I created two nice gradients and assigned them to the XR Interactor Line Visual. I also extended the visible line and tested it out in VR on the Quest 2; I had to adjust it several times before settling on a length of 40 cm. The raycast line turns white when it hits something the player can interact with (the various colliders in the scene). All the objects the player can interact with in my project have an XR Grab Interactable component, which is what allows them to be picked up.
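The same ray styling can also be applied from code instead of the Inspector. A minimal sketch, assuming XR Interaction Toolkit 2.x; the gradient colours below are illustrative stand-ins for the ones used in my project.

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: shortens the visible ray to 40 cm and sets its idle / hit gradients.
public class RayVisualSetup : MonoBehaviour
{
    [SerializeField] XRInteractorLineVisual lineVisual;

    void Start()
    {
        // Show only the first 40 cm of the ray.
        lineVisual.overrideInteractorLineLength = true;
        lineVisual.lineLength = 0.4f; // metres

        // Gradient shown while the ray is not pointing at anything interactable.
        var idle = new Gradient();
        idle.SetKeys(
            new[] { new GradientColorKey(Color.cyan, 0f), new GradientColorKey(Color.magenta, 1f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(0f, 1f) });
        lineVisual.invalidColorGradient = idle;

        // Plain white when the ray hits a valid collider / interactable.
        var hit = new Gradient();
        hit.SetKeys(
            new[] { new GradientColorKey(Color.white, 0f), new GradientColorKey(Color.white, 1f) },
            new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(1f, 1f) });
        lineVisual.validColorGradient = hit;
    }
}

On the object side, each grabbable prop needs a collider alongside its XR Grab Interactable; the component takes care of adding a Rigidbody if one is not already there.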

Raycast lines, Own Project, Unity.

After all the XR Rig elements were in place, I had a responsive VR environment where the player can move, turn around and interact with objects by colliding with or grabbing them.
