Interaction Pointers and UI

Unity provides a powerful event system which can be used to interact with a scene. It is particularly useful for UI interfaces. Humanoid Control provides extensive and flexible support for the event system through the Interaction module.

The Interaction module supports gazing/pointing and finger touching. This means that you can trigger events by looking at objects, pointing at them or by touching them. You can choose yourself which body part is used for gazing or pointing.

Note that currently only one interaction module can be used at a time!

Interaction Pointer

An interaction pointer can be attached to many items, but it is most often used for head gaze and finger pointing interaction. For your convenience, buttons are added to the Head and Hand Targets to add Interaction Pointers.

The Add Interaction Pointer button adds an interaction pointer GameObject as a child of the target.

To remove an interaction pointer, delete the Interaction Pointer child object of the Target. After this, a new interaction pointer can be added again with the button.

When an Interaction Pointer is active, you can select objects like UI elements. These objects will receive the Unity EventSystem events PointerEnter, PointerExit, PointerSelect and PointerDeselect. You can use an Event Trigger component to handle these events for generic objects with colliders.
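As an alternative to an Event Trigger component, these events can also be handled in a script. The sketch below (class name and colors are illustrative, and it assumes the events are dispatched through Unity's standard EventSystem handler interfaces) highlights a collider object while it has pointer focus:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: receives pointer focus events on any GameObject with a collider,
// as an alternative to configuring an Event Trigger component.
public class HoverHighlight : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler
{
    private Renderer objectRenderer;

    void Awake() {
        objectRenderer = GetComponent<Renderer>();
    }

    public void OnPointerEnter(PointerEventData eventData) {
        objectRenderer.material.color = Color.yellow; // highlight on focus
    }

    public void OnPointerExit(PointerEventData eventData) {
        objectRenderer.material.color = Color.white;  // restore when focus is lost
    }
}
```

Attach this script to the object with the collider; no Event Trigger component is needed in that case.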

When a click is performed with an interaction pointer, the object will receive the PointerDown, PointerUp and PointerClick events. Besides that, when you move the pointer while clicking, the BeginDrag, Drag and EndDrag events are triggered. UI elements use these events to implement, for example, clicking buttons and dragging sliders.
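The click and drag events can be observed with a small script implementing Unity's standard handler interfaces. This sketch (the class name is illustrative) simply logs each event it receives:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: logs the click and drag events received by this object.
public class ClickAndDragLogger : MonoBehaviour,
    IPointerDownHandler, IPointerUpHandler, IPointerClickHandler,
    IBeginDragHandler, IDragHandler, IEndDragHandler
{
    public void OnPointerDown(PointerEventData e)  { Debug.Log("PointerDown"); }
    public void OnPointerUp(PointerEventData e)    { Debug.Log("PointerUp"); }
    public void OnPointerClick(PointerEventData e) { Debug.Log("PointerClick"); }
    public void OnBeginDrag(PointerEventData e)    { Debug.Log("BeginDrag"); }
    public void OnDrag(PointerEventData e)         { Debug.Log("Drag at " + e.position); }
    public void OnEndDrag(PointerEventData e)      { Debug.Log("EndDrag"); }
}
```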

A detailed description of the Interaction Pointer component is found here: Interaction Pointer.

Head Gaze for the Head Target

When the Interaction Pointer is added with the button, it has a sphere as its Focus Point Object, which visualizes the point the user is looking at. This sphere is visible while the Interaction Pointer is active.

If you want, this sphere can be replaced by another visualization, for example a sprite renderer.
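One way to make this swap at runtime is sketched below, using only standard Unity components. The fields are assumptions: `focusPointObject` should be assigned in the Inspector to the Interaction Pointer's Focus Point Object, and `reticleSprite` to the sprite asset you want to show instead of the sphere.

```csharp
using UnityEngine;

// Sketch: replaces the default sphere visualization with a sprite.
// 'focusPointObject' and 'reticleSprite' must be assigned in the Inspector.
public class SpriteFocusPoint : MonoBehaviour
{
    public GameObject focusPointObject;
    public Sprite reticleSprite;

    void Start() {
        // Remove the sphere's mesh so only the sprite remains visible.
        Destroy(focusPointObject.GetComponent<MeshRenderer>());
        Destroy(focusPointObject.GetComponent<MeshFilter>());

        // Show the sprite at the focus point instead.
        SpriteRenderer spriteRenderer = focusPointObject.AddComponent<SpriteRenderer>();
        spriteRenderer.sprite = reticleSprite;
    }
}
```

The same result can of course be achieved directly in the editor by editing the Focus Point Object in the hierarchy.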

Finger Pointing for the Hand Target

For finger pointing, a Line Renderer is used on the Focus Point Object to show where the user is pointing. This line renderer is visible while the Interaction Pointer is active.

The line renderer can be removed or altered, and further visualizations can be added to the Focus Point Object.
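For example, the appearance of the pointing line can be tweaked from a script. This sketch assumes it is placed on the Focus Point Object that carries the Line Renderer; the chosen widths and color are illustrative:

```csharp
using UnityEngine;

// Sketch: restyles the pointing line at startup.
// Assumes this script sits on the object that has the Line Renderer.
public class PointingLineStyle : MonoBehaviour
{
    void Start() {
        LineRenderer line = GetComponent<LineRenderer>();
        if (line != null) {
            line.startWidth = 0.005f;         // thin beam at the finger
            line.endWidth = 0.001f;           // tapering toward the target
            line.material.color = Color.cyan; // tint the line's material
        }
    }
}
```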

Controller Input

By default, the Add Interaction Pointer button adds the following controller input actions:

  • Button 1 press: activate Interaction Pointer
  • Trigger 1 press: click with the Interaction Pointer

This can easily be altered with the Controller Input script of the humanoid. For example, the SteamVR Controller does not have a ‘Button 1’, so you can assign the touchpad press to activate the Interaction Pointer like this in Humanoid Control v2:

The Interaction Pointer object field should refer to the interaction pointer on the correct hand, in this case the left hand.

UI elements

The Interaction Pointer supports interaction with Unity’s UI elements like UI Buttons, but it is necessary to add Colliders and kinematic Rigidbodies to the UI elements you want to interact with. This is because the Interaction Pointer uses a physics raycast, which does not hit UI elements directly.

For example, for a UI button you can add a Box Collider. Make sure the X and Y size match the size of the button itself; I usually set the Z size to 1:
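The same setup can be done from a script instead of the Inspector. This sketch (the class name is illustrative) adds a kinematic Rigidbody and a Box Collider whose X and Y size are taken from the button's RectTransform, with Z set to 1 as above:

```csharp
using UnityEngine;

// Sketch: makes a UI button hittable by the pointer's raycast by
// adding a kinematic Rigidbody and a matching Box Collider.
[RequireComponent(typeof(RectTransform))]
public class AddUIButtonCollider : MonoBehaviour
{
    void Start() {
        Rigidbody rigidbody = gameObject.AddComponent<Rigidbody>();
        rigidbody.isKinematic = true; // no physics simulation, just a raycast target

        RectTransform rect = GetComponent<RectTransform>();
        BoxCollider box = gameObject.AddComponent<BoxCollider>();
        // Match X and Y to the button size; Z size of 1 as in the text.
        box.size = new Vector3(rect.rect.width, rect.rect.height, 1);
    }
}
```

Attach this to each UI element that should respond to the Interaction Pointer.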