What is new in Humanoid Control 3

New Events and Scripts

It is now easier than before to use Humanoid Control without programming. With the new event handlers you can call functions when events happen. These include events from a variety of sources.

When you want to execute more than one function from an event, you can use the new Scripts. These also enable you to set additional conditions on the execution of the script.
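
As a minimal sketch of how such a function can look (the component and method names below are made up for illustration and are not part of the Humanoid Control API; the event handler itself is configured in the Inspector), any public method on a MonoBehaviour can serve as a target:

    using UnityEngine;

    // Hypothetical receiver component; assign its public methods to a
    // Humanoid Control event handler in the Inspector.
    public class DoorOpener : MonoBehaviour {
        public Animator doorAnimator;
        public AudioSource chime;

        // Example target function, e.g. called when the humanoid enters a trigger.
        public void OpenDoor() {
            doorAnimator.SetBool("Open", true);
        }

        // A second function for the same event could be combined using a Script.
        public void PlayChime() {
            chime.Play();
        }
    }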

Hand Pose Detection and Input

Control your application by using hand poses. You can define your own Hand Poses and have Humanoid Control score the current hand pose. You can then execute functions based on the recognized poses using the new event handlers.
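
A pose-triggered function follows the same pattern (again a sketch: the class and method names are hypothetical, and the connection to the detected pose is assumed to be made through an event handler in the Inspector):

    using UnityEngine;

    // Hypothetical target for a hand pose event handler, e.g. a 'thumbs up' pose.
    public class PoseReactions : MonoBehaviour {
        public GameObject confirmationPanel;

        // Assign to the event that fires when the pose is recognized.
        public void OnThumbsUpDetected() {
            confirmationPanel.SetActive(true);
        }

        // Assign to the event that fires when the pose is no longer recognized.
        public void OnThumbsUpLost() {
            confirmationPanel.SetActive(false);
        }
    }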

Pinch interaction

With the growing support for hand tracking, more interaction options become possible. In addition to full hand grabbing, Humanoid Control v3 now also supports pinch grabbing for finer interactions.

Sockets

Handles, which you can grab with your hands, were already available. New are Sockets, to which handles can be attached in a specific way. Read about them on the Sockets page.

Mechanical Joints

Physics joints are often too complex for simple mechanical connections. The Mechanical Joint gives you an easy way to connect manipulable Rigidbodies to other objects. Read all about it on the Mechanical Joint page.
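
To illustrate the point (the sketch below uses Unity's standard ConfigurableJoint API, not the Mechanical Joint itself), even a simple sliding connection built with a plain physics joint takes a fair amount of setup:

    using UnityEngine;

    // A sliding drawer built with Unity's own ConfigurableJoint,
    // shown for comparison with the simpler Mechanical Joint.
    public class SlidingDrawer : MonoBehaviour {
        public Rigidbody cabinetBody;

        void Start() {
            ConfigurableJoint joint = gameObject.AddComponent<ConfigurableJoint>();
            joint.connectedBody = cabinetBody;

            // Allow movement along the local X axis only.
            joint.xMotion = ConfigurableJointMotion.Limited;
            joint.yMotion = ConfigurableJointMotion.Locked;
            joint.zMotion = ConfigurableJointMotion.Locked;
            joint.angularXMotion = ConfigurableJointMotion.Locked;
            joint.angularYMotion = ConfigurableJointMotion.Locked;
            joint.angularZMotion = ConfigurableJointMotion.Locked;

            // Limit how far the drawer can slide out (in meters).
            SoftJointLimit limit = new SoftJointLimit();
            limit.limit = 0.4f;
            joint.linearLimit = limit;
        }
    }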

Valve Index / SteamVR Knuckles support

The new controllers from Valve support the skeletal input option, which provides a full hand pose in addition to the traditional controller button input. With Humanoid Control v3 you can use this skeleton directly in your projects.

Mirror and Photon Bolt networking support

Humanoid Control has a more generic approach to networking. This enables support for new networking solutions. In version 3 we have added support for Mirror Networking and Photon Bolt.

Unity XR support

Starting with Unity 2019.3, Unity is moving toward the new Unity XR package management for VR. Humanoid Control provides built-in support for this new approach. Read more about it on the Unity XR page.
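
For reference (this sketch uses Unity's own XR Management API and is independent of Humanoid Control; it assumes 'Initialize XR on Startup' is disabled so the loader is started manually), starting the active XR loader under the new package management looks like this:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.Management;

    // Manually initializes and starts the XR loader selected in
    // Project Settings > XR Plug-in Management.
    public class ManualXRStart : MonoBehaviour {
        IEnumerator Start() {
            yield return XRGeneralSettings.Instance.Manager.InitializeLoader();
            if (XRGeneralSettings.Instance.Manager.activeLoader != null)
                XRGeneralSettings.Instance.Manager.StartSubsystems();
        }
    }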

Humanoid Control Plus

Oculus Quest Hand Tracking

With the latest Oculus Integration package you can now enable hand tracking on the Oculus Quest with just one setting. You can find more information on this on the Oculus Quest page.

Vive Hand Tracking support

For the Vive, HTC also provides hand tracking using the built-in cameras. With the HTC Hand Tracking SDK, enabling this option is as easy as it can get. Read more about this on the Vive Hand Tracking page.

Azure Kinect support

With the Azure Kinect, body tracking reaches a new level. It is no longer necessary to face the camera, which improves the possibilities for body tracking. Information on the setup is found on the Azure Kinect page.

Humanoid Control Pro

Apple ARKit face tracking

The Apple iPhone provides a great face tracking option. You can now use it directly in Humanoid Control for facial expressions using the ARKit extension in Humanoid Control Pro.

Antilatency support

For location-based experiences, Antilatency provides a great solution for small to large scale multiplayer tracking. Full body tracking is now supported through the new Antilatency integration.

Noitom Hi5 Glove support

The Hi5 Glove provides additional hand tracking for body tracking systems like Vive Trackers and Antilatency. Read more about it on the Noitom Hi5 Glove support page.

Eye Behaviour

In Humanoid Control Pro, the eyes of an avatar can respond interactively to their environment. They will automatically focus on the most important object or person in the vicinity.