Oculus Rift and Touch are supported for tracking the head and hands of an avatar.
Oculus Rift is supported in InstantVR version 1.0 and higher.
Oculus Touch is supported in InstantVR Advanced and Edge version 3.6 and higher.
Oculus Rift and Touch are supported in Humanoid Control VR, VR+ and Pro.
Both the Oculus Rift DK2 and the CV1 with Touch controllers are supported.
Oculus Rift & Touch are supported on Microsoft Windows. Ensure that PC, Mac & Linux Standalone is selected as the platform in the Build Settings.
Pre-Unity 2017.2: Virtual Reality Supported needs to be enabled in Edit Menu->Project Settings->Player->Other Settings, and Oculus needs to be added to the Virtual Reality SDKs.
Unity 2017.2 and higher: Virtual Reality Supported needs to be enabled in Edit Menu->Project Settings->Player->XR Settings, and Oculus needs to be added to the Virtual Reality SDKs.
Note: if both Oculus and SteamVR need to be supported in one build, make sure Oculus is listed above OpenVR in the Virtual Reality SDKs.
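The configured SDK order can be verified at runtime with Unity's XR API. This is a minimal sketch (the class name is hypothetical) that logs the enabled VR SDKs in the order Unity will try them; with both SDKs enabled, "Oculus" should appear before "OpenVR":

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical helper: logs the Virtual Reality SDKs in their configured
// order so you can verify that Oculus is listed above OpenVR (Unity 2017.2+).
public class VRSDKOrderCheck : MonoBehaviour {
    void Start() {
        string[] sdks = XRSettings.supportedDevices;
        for (int i = 0; i < sdks.Length; i++)
            Debug.Log("VR SDK " + i + ": " + sdks[i]);
    }
}
```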
Oculus needs to be enabled in the Edit Menu->Preferences->Instant VR->Oculus Support.
The script IVR_Touch needs to be added to the character GameObject with the InstantVR script.
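The IVR_Touch extension is normally added in the editor, but it can also be attached from code. A minimal sketch, assuming the class names IVR_Touch and InstantVR from the text are accessible from your scripts (the exact namespace may differ per InstantVR version, and the helper class name is hypothetical):

```csharp
using UnityEngine;

// Hypothetical setup helper: attaches IVR_Touch to a character that already
// has the InstantVR script, if it is not present yet.
public class AddTouchSupport : MonoBehaviour {
    void Awake() {
        InstantVR ivr = GetComponent<InstantVR>();
        if (ivr != null && GetComponent<IVR_Touch>() == null)
            gameObject.AddComponent<IVR_Touch>();
    }
}
```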
Oculus needs to be enabled in the Edit Menu->Preferences->Humanoid->Oculus Support.
Disabling Oculus Support ensures that no code related to Oculus is included in the build.
To enable body tracking with Oculus for an avatar, Oculus needs to be enabled in the Humanoid Control component.
The First Person Camera option needs to be enabled for the Oculus Rift. For convenience, this option is also available on the Humanoid Control script.
The Touch Controller needs to be enabled on the Hand Targets for controller support.
Oculus Touch Controller models are shown in the scene when Humanoid Control->Settings->Show Real Objects is enabled. These models can be moved in the scene to place the controllers at the right position relative to the hands of the avatar. A reference to these transforms is found in the Tracker Transform field.
By default, the origin position is set to (0,0,0), which means the Oculus calibration information will be used. This is sufficient in most cases.
The origin can also be set manually: the real-world position and orientation of the origin are set using the Tracker Position/Tracker Angles parameters of the IVR_UnityVR script.
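Setting the origin from code could look like the sketch below. The field names trackerPosition and trackerEulerAngles are assumptions matching the Tracker Position/Tracker Angles inspector parameters; check the IVR_UnityVR script in your version for the actual names:

```csharp
using UnityEngine;

// Sketch: overrides the default (0,0,0) tracking origin manually.
// Field names are assumptions based on the inspector labels.
public class SetTrackingOrigin : MonoBehaviour {
    void Start() {
        IVR_UnityVR tracker = GetComponent<IVR_UnityVR>();
        if (tracker != null) {
            tracker.trackerPosition = new Vector3(0, 1, 0);    // example position
            tracker.trackerEulerAngles = new Vector3(0, 90, 0); // example rotation
        }
    }
}
```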
When using the InputHandler example script, calibration is initiated by pressing the Tab key on the keyboard.
The tracking position and orientation can be calibrated during gameplay by calling the Calibrate() function on the HumanoidControl object. This is often implemented using the Controller Input component.
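A minimal sketch of this runtime calibration, calling Calibrate() on the HumanoidControl component when the Tab key is pressed (the helper class name is hypothetical; Calibrate() and HumanoidControl are named in the text above):

```csharp
using UnityEngine;

// Hypothetical example: recalibrates tracking during gameplay on Tab,
// mirroring the InputHandler example behaviour.
public class CalibrateOnTab : MonoBehaviour {
    private HumanoidControl humanoid;

    void Start() {
        humanoid = GetComponent<HumanoidControl>();
    }

    void Update() {
        if (humanoid != null && Input.GetKeyDown(KeyCode.Tab))
            humanoid.Calibrate();
    }
}
```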