Apple ARKit

With Apple ARKit it is possible to do accurate face tracking in combination with Humanoid Control.

Prerequisites

Apple ARKit face tracking is supported in the Humanoid Control Pro edition.

Unity

ARKit face tracking requires Unity 2019.1 or higher.

Hardware

An Apple iOS device that supports ARKit face tracking is required, for example a device with a TrueDepth front-facing camera.

Operating System

ARKit face tracking runs on iOS. Building the Unity project for iOS requires Xcode, so macOS is needed for development.

Setup

For ARKit face tracking to work, a number of packages need to be installed in Unity.

In the Unity Package Manager install the following packages:

  • AR Foundation
  • ARKit Face Tracking
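
Both packages can also be added by editing Packages/manifest.json directly. The excerpt below is only a sketch: the package names are the official Unity identifiers, but the version numbers are illustrative and should be chosen to match your Unity release.

    {
      "dependencies": {
        "com.unity.xr.arfoundation": "4.1.13",
        "com.unity.xr.arkit-face-tracking": "4.1.13"
      }
    }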

The ARKit extension of Humanoid Control will then be enabled automatically.

Configuration

To use ARKit face tracking for an avatar, ARKit needs to be enabled on the Humanoid Control component.

The Tracker Transform is a reference to the Transform in the scene which represents the ARKit tracker at runtime.
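
Independently of Humanoid Control, you can check on the device whether ARKit face tracking is available at all, which helps rule out hardware or package issues before looking at the avatar configuration. The sketch below only uses the AR Foundation API and assumes an ARFaceManager component is present in the scene:

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class FaceTrackingCheck : MonoBehaviour {
        [SerializeField] ARFaceManager faceManager;  // assign the ARFaceManager in the scene

        void Update() {
            // ARSession.state reports whether ARKit is supported and running on this device
            Debug.Log("AR session state: " + ARSession.state +
                      ", faces tracked: " + (faceManager != null ? faceManager.trackables.count : 0));
        }
    }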

Head Target

On the Head Target component it is possible to enable or disable Head and/or Face Tracking for the avatar.

Facial Bones Configuration

For the face tracking to work well you may need to configure the face in the Head Target. The best results are currently achieved with facial bones, because they give a clearer definition of how the face moves when they are adjusted.

We try to detect the facial bones automatically, but it is good to check these bones and adjust them where necessary. The facial bones can be found in the Configuration section of the Head Target.

The following bones are used for facial tracking with ARKit:

  • Left Eye Brow, Outer Bone
  • Left Eye Brow, Center Bone
  • Left Eye Brow, Inner Bone
  • Right Eye Brow, Inner Bone
  • Right Eye Brow, Center Bone
  • Right Eye Brow, Outer Bone
  • Left Eye, Upper Lid
  • Right Eye, Upper Lid
  • Mouth, Upper Lip Left
  • Mouth, Upper Lip
  • Mouth, Upper Lip Right
  • Mouth, Left Lip Corner
  • Mouth, Right Lip Corner
  • Mouth, Lower Lip Left
  • Mouth, Lower Lip
  • Mouth, Lower Lip Right
  • Jaw
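
Bone-based tracking works by rotating or translating the bone Transforms. As a simplified, hypothetical illustration (not Humanoid Control's actual implementation), the sketch below opens a jaw bone in proportion to ARKit's jawOpen blendshape coefficient, which ARKit reports in the range 0..1:

    using UnityEngine;

    public class JawBoneExample : MonoBehaviour {
        [SerializeField] Transform jawBone;        // the avatar's jaw bone
        [SerializeField] float maxJawAngle = 30f;  // degrees of rotation at jawOpen = 1
        Quaternion closedRotation;

        void Start() {
            closedRotation = jawBone.localRotation;  // remember the neutral (closed) pose
        }

        // jawOpen is the ARKit coefficient, in the range 0..1
        public void SetJawOpen(float jawOpen) {
            jawBone.localRotation = closedRotation * Quaternion.AngleAxis(jawOpen * maxJawAngle, Vector3.right);
        }
    }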

Blendshapes Configuration

Alternatively, you can use blendshapes to drive the facial expressions. Currently, blendshapes always need to be selected manually. They can be found in the same Configuration section of the Head Target as the facial bones.

The following blendshapes are used for facial tracking with ARKit:

  • Left Eye Brow, Up Blendshape
  • Left Eye Brow, Down Blendshape
  • Right Eye Brow, Up Blendshape
  • Right Eye Brow, Down Blendshape
  • Left Eye, Eye Closed Blendshape
  • Right Eye, Eye Closed Blendshape
  • Mouth, Raise Left Blendshape
  • Mouth, Raise Right Blendshape
  • Mouth, Lower Left Blendshape
  • Mouth, Lower Right Blendshape
  • Mouth, Narrow Left Blendshape
  • Mouth, Stretch Left Blendshape
  • Mouth, Narrow Right Blendshape
  • Mouth, Stretch Right Blendshape
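
Under the hood, ARKit delivers a set of per-frame blendshape coefficients, one value in the range 0..1 per facial expression, which Humanoid Control maps to the blendshapes selected above. To verify that the device is actually delivering these values, you can log them directly through AR Foundation. The sketch below is independent of Humanoid Control and assumes an ARFaceManager component is present in the scene:

    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    public class BlendshapeLogger : MonoBehaviour {
        [SerializeField] ARFaceManager faceManager;  // assign the ARFaceManager in the scene

        void OnEnable()  { faceManager.facesChanged += OnFacesChanged; }
        void OnDisable() { faceManager.facesChanged -= OnFacesChanged; }

        void OnFacesChanged(ARFacesChangedEventArgs args) {
            // The ARKit-specific subsystem exposes the raw blendshape coefficients
            var subsystem = faceManager.subsystem as ARKitFaceSubsystem;
            if (subsystem == null)
                return;

            foreach (var face in args.updated) {
                using (var coefficients = subsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp)) {
                    foreach (var c in coefficients)
                        Debug.Log(c.blendShapeLocation + ": " + c.coefficient);
                }
            }
        }
    }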