Oculus Rift/Touch

Oculus Rift and Touch are supported to track the head and hands of an avatar.

Prerequisites

InstantVR

Oculus Rift is supported in InstantVR version 1.0 and higher.

Oculus Touch is supported in InstantVR Advanced and Edge version 3.6 and higher.

Humanoid Control

Oculus Rift and Touch are supported in Humanoid Control VR, Plus and Pro.

Hardware

Oculus Rift DK2, Rift (S)+Touch and Quest (via Oculus Link) are supported.

Operating System

Oculus Rift and Touch are supported on Microsoft Windows. Ensure that PC, Mac & Linux Standalone is selected as the platform in the Build Settings.

Setup

Pre Unity 2017.2: Virtual Reality Supported needs to be enabled in Edit Menu->Project Settings->Player->Other Settings. Oculus needs to be added to the Virtual Reality SDKs.

Unity 2017.2 and higher: Virtual Reality Supported needs to be enabled in Edit Menu->Project Settings->Player->XR Settings. Oculus needs to be added to the Virtual Reality SDKs.

Note: if both Oculus and OpenVR need to be supported in one build, make sure Oculus is listed above OpenVR in the Virtual Reality SDKs list.
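For automated builds, the same settings can be applied from an editor script. This is a minimal sketch, assuming the Unity 2017.x editor API (PlayerSettings.virtualRealitySupported and the internal UnityEditorInternal.VR.VREditor class); the order of the SDK names in the array determines which SDK is tried first:

    // Editor-only sketch for Unity 2017.x; place the file in an Editor folder.
    using UnityEditor;
    using UnityEditorInternal.VR; // assumption: internal API, may change between Unity versions

    public static class OculusBuildSetup {
        [MenuItem("Tools/Enable Oculus VR Support")]
        public static void EnableOculus() {
            // Switch to the standalone platform (Oculus Rift/Touch is Windows-only).
            EditorUserBuildSettings.SwitchActiveBuildTarget(
                BuildTargetGroup.Standalone, BuildTarget.StandaloneWindows64);

            // Equivalent to ticking 'Virtual Reality Supported' in the Player/XR Settings.
            PlayerSettings.virtualRealitySupported = true;

            // Oculus listed before OpenVR, so Oculus is used when both are in the build.
            VREditor.SetVREnabledDevicesOnTargetGroup(
                BuildTargetGroup.Standalone,
                new string[] { "Oculus", "OpenVR" });
        }
    }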

InstantVR

Oculus needs to be enabled in the Edit Menu->Preferences->Instant VR->Oculus Support.

The script IVR_Touch needs to be added to the character GameObject with the InstantVR script.
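This is normally done in the Unity editor Inspector, but it can also be scripted. A minimal sketch, assuming the InstantVR and IVR_Touch component classes of the asset are directly accessible without a namespace:

    using UnityEngine;

    // Adds IVR_Touch next to the InstantVR component if it is not present yet.
    // Sketch only: depending on the asset version, extensions added at runtime
    // may not be picked up, so adding the component in the editor is preferred.
    public class AddTouchSupport : MonoBehaviour {
        void Awake() {
            InstantVR ivr = GetComponent<InstantVR>();
            if (ivr != null && GetComponent<IVR_Touch>() == null)
                gameObject.AddComponent<IVR_Touch>();
        }
    }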

Humanoid Control

Oculus needs to be enabled in the Edit Menu->Preferences->Humanoid->Oculus Support.

Disabling Oculus Support ensures that no code related to Oculus is included in the build.

Targets

To enable body tracking with Oculus for an avatar, Oculus needs to be enabled in the Humanoid Control component.

Head Target

To enable head tracking with Oculus for an avatar, Oculus HMD needs to be enabled in the Humanoid Control component.
The Tracker Transform is a reference to the HMD itself in the Real World pose.

You can disable the First Person Camera to get a head pose for the avatar without a camera being attached to the avatar's head. This can be useful if you want to create a third-person view of the user, for example.
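With the First Person Camera disabled, an ordinary Unity camera can be used to frame the avatar instead. A minimal sketch of such a third-person camera (plain Unity API; the target and offset values are placeholders to be tuned for your avatar):

    using UnityEngine;

    // Simple third-person follow camera. Attach to a Camera and assign the
    // avatar (or its head bone) as the target. Assumes the First Person Camera
    // on the Head Target has been disabled, so this camera provides the view.
    public class ThirdPersonView : MonoBehaviour {
        public Transform target;                           // avatar root or head bone
        public Vector3 offset = new Vector3(0, 1.5f, -3);  // behind and above the avatar

        void LateUpdate() {
            if (target == null)
                return;
            transform.position = target.position + offset;
            transform.LookAt(target.position + Vector3.up * 1.5f);
        }
    }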

Hand Target

The Touch Controller needs to be enabled on the Hand Targets for controller support.

Oculus Touch Controller models are shown in the scene when Humanoid Control->Settings->Show Real Objects is enabled. These models can be moved in the scene to place the controllers at the correct position relative to the hands of the avatar. A reference to these transforms is found in the Tracker Transform field.

Controller Input

The buttons of the Oculus Touch controller can be accessed using the Game Controller Input. The buttons are mapped as follows:

Left Controller

Y button: controller.left.button[0]
X button: controller.left.button[1]
Thumbstick touch: controller.left.stickTouch
Thumbstick press: controller.left.stickButton
Thumbstick movement: controller.left.stickHorizontal / controller.left.stickVertical
Index trigger: controller.left.trigger1
Hand trigger: controller.left.trigger2

Right Controller

A button: controller.right.button[0]
B button: controller.right.button[1]
Thumbstick touch: controller.right.stickTouch
Thumbstick press: controller.right.stickButton
Thumbstick movement: controller.right.stickHorizontal / controller.right.stickVertical
Index trigger: controller.right.trigger1
Hand trigger: controller.right.trigger2
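For example, these values can be read from a script. A minimal sketch, assuming the controller state is obtained with Controllers.GetController(0) in the Passer namespace; how the controller reference is obtained may differ per version of the asset, but the field names follow the tables above:

    using UnityEngine;
    using Passer; // assumption: the asset's runtime namespace

    // Reads Oculus Touch input through the Game Controller Input mapping above.
    public class TouchInputExample : MonoBehaviour {
        private Controller controller;

        void Start() {
            // Assumption: controller 0 is the game controller used by Humanoid Control.
            controller = Controllers.GetController(0);
        }

        void Update() {
            if (controller == null)
                return;

            // Index trigger on the right Touch controller (trigger1 in the table above).
            if (controller.right.trigger1 > 0.5f)
                Debug.Log("Right index trigger pulled");

            // Thumbstick movement on the left Touch controller.
            float moveX = controller.left.stickHorizontal;
            float moveY = controller.left.stickVertical;
            Debug.Log("Left stick: " + moveX + ", " + moveY);

            // A button on the right Touch controller (button[0] in the table above).
            Debug.Log("A button: " + controller.right.button[0]);
        }
    }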

Calibration

InstantVR

By default, the origin position is set to (0, 0, 0), which means that the Oculus calibration information is used. This is sufficient in most cases.

It is possible to set the origin manually, though. The real-world position and orientation of the origin can be set using the Tracker Position/Tracker Angles parameters of the IVR_UnityVR script.

When using the InputHandler example script, calibration is initiated when the Tab key on the keyboard is pressed.

Humanoid Control

The tracking position and orientation can be calibrated during gameplay by calling the Calibrate() function on the HumanoidControl object. This is often implemented using the Controller Input component.
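A minimal sketch of such a calibration trigger, assuming the HumanoidControl component lives in the Passer namespace (only the Calibrate() call itself is taken from the description above); here a keyboard key is used instead of the Controller Input component:

    using UnityEngine;
    using Passer; // assumption: the asset's runtime namespace

    // Calls Calibrate() on the Humanoid Control component when a key is pressed.
    public class CalibrateOnKey : MonoBehaviour {
        public HumanoidControl humanoid;           // drag the avatar's Humanoid Control here
        public KeyCode calibrationKey = KeyCode.C;

        void Update() {
            if (humanoid != null && Input.GetKeyDown(calibrationKey))
                humanoid.Calibrate();
        }
    }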