Some advice and basic scripts to handle events from the controllers of an Oculus Quest (or Quest 2), and to use the VR headset to join a collaborative virtual environment shared through the Photon network engine
In case you need to connect an Oculus Quest 2 to the eduroam network:
Security: WPA/WPA2 Enterprise
Authentication: TTLS
Anonymous identity: anonymous@imta.fr or anonymous@imta.net (depending on your status: staff or student)
No certificate
Inner authentication: MSCHAPv2
Username: your IMT Atlantique login
If not done yet, you may need to set the VR headset to "developer mode"; this is done through the Oculus app on a mobile phone paired with the headset
First, you need to import the "OpenXR Plugin" and the "Oculus XR Plugin" packages ("Window / Package Manager").
Modify your Unity project in order to create an apk that you will deploy on an Oculus Quest VR headset:
In the "File / Build Settings..." menu, choose the "Android" platform (switching can take quite a long time)
In the "Project Settings":
XR Plug-in Management:
Select the "Initialize XR on Startup" checkbox
Plug-in Providers: choose "OpenXR"
XR Plug-in Management / OpenXR / OpenXR Feature Groups: choose "Meta Quest Support"
Player / Other Settings / Identification / Minimum API Level: choose "Android 6.0 'Marshmallow' (API level 23)"
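As a side note, the minimum API level above can also be applied from an editor script, which is convenient if several machines share the project. This is only a sketch using the Unity Editor scripting API (the class and menu names are assumptions, not part of the tutorial assets); the file must be placed in an "Editor" folder:

```csharp
using UnityEditor;

// Editor-only helper (hypothetical, not part of the tutorial assets):
// applies the minimum Android API level required by the Quest build.
public static class QuestBuildSettings {

    [MenuItem ("Build/Apply Quest Player Settings")]
    public static void ApplyMinApiLevel () {
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel23;
    }
}
```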
Setup a test scene to check that the VR helmet tracking works fine:
Create a new folder called "Quest"
In this folder, create a new scene called "QuestTest"
Add 3 "GrabableObject" prefabs to this scene (one red, one green, one blue)
Also add an "XR Interaction Manager" to the scene
In the "Resources" folder, create a new prefab called "XRNavigationRig" by copying/pasting the "NavigationRig" prefab
Edit this "XRNavigationRig":
Add an "XR Origin (Action-based)"
Remove the "XR Controller" from the "RightHand Controller"
Replace it with the "Samples / XR Interaction Toolkit / 2.5.4 / Starter Assets / Preset / XRI Default Right Controller" preset
Do the same kind of thing for the "LeftHand Controller"
To add visuals of your controllers:
Add the prefabs located in "Samples / XR Interaction Toolkit / 2.5.4 / Starter Assets / Prefabs / Controllers" to Left and Right controllers
Add an "XRNavigationRig" to the "QuestTest" scene
In the "Build Settings", replace the previous "Scenes in Build" by this new "QuestTest" scene
Then deploy the apk on your VR headset ("Build And Run") and test it:
You should be immersed in your virtual environment
You should be able to look around you by turning your head
You should see your controllers moving according to your hand movements
You should be able to navigate in the virtual environment with the joystick of your left controller
You should be able to grab objects with both controllers
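If one of these checks fails (typically the joystick navigation), a small debug script can help isolate whether the problem comes from the input or from the navigation logic. The script below is a hypothetical helper, not part of the tutorial assets; it uses Unity's XR input API to log the raw left joystick value:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical debug helper: attach it to any GameObject in the scene
// to log the left controller joystick value each frame.
public class XRInputDebug : MonoBehaviour {

    void Update () {
        InputDevice leftHand = InputDevices.GetDeviceAtXRNode (XRNode.LeftHand);
        if (leftHand.isValid &&
            leftHand.TryGetFeatureValue (CommonUsages.primary2DAxis, out Vector2 axis)) {
            Debug.Log ("Left joystick: " + axis);
        }
    }
}
```

The logs can then be read on the headset with "adb logcat" while the application runs.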
To obtain a better interaction than the one provided by the default "XR Ray Interactor":
Create a new ray interactor called "PhotonXRRayInteractor" (that will be used later with Photon):
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using Photon.Pun;

namespace RVC {

    public class PhotonXRRayInteractor : XRRayInteractor {

        protected PhotonTool photonTool;

        protected override void OnSelectEntering (SelectEnterEventArgs args) {
            base.OnSelectEntering (args);
            IXRSelectInteractable interactable = args.interactableObject;
            if (!useForceGrab) {
                // Keep the object at its current position relative to the hit point
                // instead of snapping it to the hand
                if (TryGetCurrent3DRaycastHit (out var raycastHit)) {
                    attachTransform.SetPositionAndRotation (
                        interactable.transform.position +
                        (gameObject.transform.position - raycastHit.point),
                        interactable.transform.rotation);
                }
            } else {
                // Force grab: bring the object to the interactor itself
                attachTransform.SetPositionAndRotation (
                    interactable.transform.position,
                    interactable.transform.rotation);
            }
        }
    }
}
Remove the "XR Ray Interactor" from the "RightHand Controller" and replace it with this new script
Add a "PhotonTool" to the "RightHand Controller"
Deploy and test: now the interaction with the grabable objects should be more natural
So now, do the same for the "LeftHand Controller", deploy and test it again
Modify the GameManager of the Arena to create an "XRNavigationRig" when joining the Arena:
In the Unity editor, drag and drop the "XRNavigationRig" onto the "DesktopPrefab" field
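For reference, the relevant part of such a GameManager probably looks like the sketch below. The class name, the "desktopPrefab" field name and the callback used are assumptions based on the usual Photon PUN pattern; adapt them to the actual script of your Arena:

```csharp
using UnityEngine;
using Photon.Pun;

// Sketch of a GameManager that spawns the local user's rig when the room
// is joined (names are assumptions, adapt to your own GameManager).
public class GameManagerSketch : MonoBehaviourPunCallbacks {

    [Tooltip ("Rig instantiated for the local user; the prefab must live in a Resources folder")]
    public GameObject desktopPrefab;   // drag the "XRNavigationRig" prefab here

    public override void OnJoinedRoom () {
        // PhotonNetwork.Instantiate loads the prefab by name from "Resources"
        // and replicates it on every client in the room
        PhotonNetwork.Instantiate (desktopPrefab.name, Vector3.zero, Quaternion.identity);
    }
}
```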
Modify the "Build Settings":
Unselect the "QuestTest" scene
Select again the previous "Scenes in Build": "Launcher" and "Arena"
Modify the "Launcher.cs" script:
Add a direct call to the "Connect" method in the "Start" method of "Launcher.cs"
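This change is a one-liner, sketched below under the assumption that "Launcher.cs" exposes the "Connect" method of the Photon PUN basics tutorial:

```csharp
// Inside Launcher.cs (sketch): connect immediately at startup instead of
// waiting for the user to click a UI button, since there is no mouse in VR.
void Start () {
    Connect ();
}
```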
Deploy and test: you should directly enter the shared virtual environment and be able to interact with the grabable objects using the Quest 2 controllers
Beware: make sure that you have also added an "XR Interaction Manager" to the "Arena" scene
When several users join the shared virtual environment with their headsets, the "proxies" of the heads and controllers of the remote users could be driven by the events produced by the local user.
To avoid that, you must disable the behaviour of some parts of the interaction tools:
First, to avoid stealing the viewpoint of the other users when joining the shared virtual environment, create a new "XRNavigation.cs" script and associate it to your "XRNavigationRig" prefab:
using UnityEngine;
using Photon.Pun;

namespace RVC {

    public class XRNavigation : DesktopNavigation {

        public new void Awake () {
            base.Awake ();
            // Disable the camera of the rigs belonging to remote users, so that
            // only the local user's camera renders the scene
            if (!photonView.IsMine) {
                Camera theCamera = GetComponentInChildren<Camera> ();
                if (theCamera != null) theCamera.enabled = false;
            }
        }
    }
}
Second, to prevent the controllers of the remote users from being driven by local input, add this method to the "PhotonXRRayInteractor.cs" script to deactivate the "ActionBasedController" of the tool:
new void Start () {
    base.Start ();
    photonTool = GetComponentInChildren<PhotonTool> ();
    // Disable the controller driving this tool when it belongs to a remote user
    if (!photonTool.photonView.IsMine) {
        ActionBasedController abc = GetComponentInChildren<ActionBasedController> ();
        if (abc != null) abc.enabled = false;
    }
}
Now, several users with VR headsets should be able to join the shared virtual environment without any problem.