Inverse kinematics for arms in Unity Free/Indie

I have been playing and struggling with body movements over the last few weeks. My body movements are mostly driven by the Razer Hydra. Recently I found out that a lot of information can be deduced from this device. I will write down the results in a series of posts in the coming weeks. This week I start with the basics: inverse kinematics for the arms. To use this, you do not need to understand the mathematics behind inverse kinematics. We just want it to work for us, and I will share that with you here. If you want to know more about the background of inverse kinematics and see it applied to a different part of the body, see my second post on inverse kinematics: Inverse Kinematics part 2: bending the body forward based on hand positions.

First, a short explanation of forward and inverse kinematics. The easier of the two is forward kinematics: the inputs drive the joints, which results in a certain gesture. I use that for driving the hands: for example, the analogue trigger on each Hydra device determines the bending of the fingers: the further I press the trigger, the more the fingers bend. This is straightforward: the input determines the bending and that is all.
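
To make that concrete, here is a minimal sketch of such a forward mapping. The class name, the joint array and the `Input.GetAxis("Fire1")` stand-in are my assumptions; in my setup the value would come from the Hydra trigger via the Sixense plugin instead.

    using UnityEngine;

    // Hypothetical sketch of forward kinematics for the fingers:
    // a 0..1 trigger value directly drives the bend of each finger joint.
    public class FingerBend : MonoBehaviour
    {
        public Transform[] fingerJoints; // assign the finger bones of the rig here
        public float maxBendAngle = 70f; // bend per joint at full trigger, in degrees

        void Update()
        {
            // Stand-in for the Hydra's analogue trigger value (0..1).
            float trigger = Input.GetAxis("Fire1");

            // Forward kinematics: the input sets the joint rotations directly.
            foreach (Transform joint in fingerJoints)
            {
                joint.localRotation = Quaternion.Euler(trigger * maxBendAngle, 0, 0);
            }
        }
    }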

If this were inverse kinematics, I would set the desired location at which the fingers should be. The inverse kinematics mechanism should then determine the bending of the fingers (and hands, and arms, and torso, and legs…) needed to reach that spot. This is much more complex.

Unity Pro has built-in inverse kinematics for arms and legs, but I have never used it. The reason is that I want to understand the mechanics (which I still do not) and extend it (which I did). Fortunately, the Unity Asset Store has a free implementation of inverse kinematics. This is my starting point.

The package contains three implementations of inverse kinematics. As I am doing all my code in C#, I took the ‘BrunoFerreira’ variant of the implementation. Although using the inverse kinematics is not difficult, some attention is needed to set it up.

[Image: MeHierarchy]

But first a word on my game hierarchy. I have split the player geometry into two parts: the RealMe and the VirtualMe. The RealMe deals with the real player's body: it contains sensor information like the input of the Razer Hydra and Oculus Rift, and the cameras to render for (or in simple words, the screen of the Oculus Rift), but also a simple representation of the real body (I will come back to this in a later post). The VirtualMe contains the mesh of the player's avatar and some additional information. Apart from that, we need a number of functions which map the RealMe to the VirtualMe and vice versa. The inverse kinematics function is one of these.

The inverse kinematics function is placed directly within VirtualMe. You can see it in the hierarchy at the right, named ‘IK_L’ and ‘IK_R’, one for each arm.
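
Pieced together from the description above, the relevant part of the hierarchy looks roughly like this (the top-level name ‘Player’ is mine, and the exact nesting in the project may differ):

    Player
        RealMe      (Razer Hydra and Oculus Rift input, cameras, simple real body)
        VirtualMe   (avatar mesh and additional information)
            MeRig   (the bone structure)
            IK_L    (inverse kinematics, left arm)
            IK_R    (inverse kinematics, right arm)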

The VirtualMe mesh originates from MakeHuman and was adjusted in Blender, after which I imported it into Unity3D. Nothing fancy yet: it is the default MakeHuman character. I still need to spend more time in those tools to make proper figures.

The VirtualMe from MakeHuman

VirtualMe comes with a rig: a separate mesh representing the bone structure. This structure determines where the figure can bend, like in a real body. You can see the bones in the mesh picture, but also in the hierarchy, starting with ‘MeRig’. The important part of the rig for now is the clavicle (the shoulder and arm). We need to assign those bones to the parameters of the inverse kinematics function: the Upper Arm, Forearm and Hand are set to the corresponding rig parts.

[Image: IKinspector]
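
In code, those inspector slots correspond roughly to public fields like these. This is a sketch based on the description above; the exact names and types in the BrunoFerreira script may differ slightly:

    using UnityEngine;

    // Sketch of the inspector parameters of the IK script (names approximate).
    public class ArmIK : MonoBehaviour
    {
        public Transform upperArm;    // rig's upper arm bone
        public Transform forearm;     // rig's forearm bone
        public Transform hand;        // rig's hand bone
        public Transform target;      // where the hand should end up
        public Transform elbowTarget; // hint for the bend direction of the elbow
        public float transition = 1f; // blend between the animated pose and the IK result
        public bool debug;            // draws helper lines in the scene view when enabled
    }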

Then we come to the targets: these are the points to which the arm should move. There are two targets for each arm: the (hand) target and the elbow target. To start with the second: in theory there are two ways an arm can bend for the hand to reach a point, but in practice there is only one, as the elbow is restricted in the ways it can bend. For me, it works well to have the elbow targets beside the body: the position where the elbows are when the arms hang straight along the body.

The targets are simple cubes which are normally not rendered. The elbow targets do not even move: they should always stay beside the body.

[Image: VirtualMeWithTargets]
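
If you prefer to create such targets from code rather than in the editor, a minimal sketch could look like this (the helper name and parameters are hypothetical):

    using UnityEngine;

    // Hypothetical helper: creates an invisible cube to use as an IK target.
    public static class IKTargetFactory
    {
        public static Transform CreateTarget(string name, Transform parent, Vector3 localPosition)
        {
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.name = name;
            cube.transform.parent = parent;
            cube.transform.localPosition = localPosition;
            cube.transform.localScale = Vector3.one * 0.1f;

            // The cube is only a reference point: hide it by disabling its renderer.
            cube.GetComponent<MeshRenderer>().enabled = false;
            return cube.transform;
        }
    }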

The hand targets do move, of course, and are driven by the positions and rotations of the Razer Hydra devices. To achieve that, you need to put the ‘Sixense Hand Controller’ script on those targets, with a sensitivity of 0.001. But if you used the hand targets directly, you would see that the positions are set to the wrists instead of the hands. This is because the VirtualMe hand origin is at the wrist: this is the point around which the hand rotates. To compensate for this, I create an additional child game object for every hand target, called the Palm Target. It is offset in the Y direction by 0.1 units (in the figure this shows as 1, because the target is scaled by 0.1). We use those Palm Targets as the targets for our script.

[Image: PalmTargets]
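
A minimal sketch of that compensation, assuming you set it up from code instead of the editor (the class and field names are mine):

    using UnityEngine;

    // Hypothetical sketch: adds a Palm Target child under each hand target,
    // offset along the local Y axis, so the IK solver aims at the palm
    // instead of the wrist.
    public class PalmTargetSetup : MonoBehaviour
    {
        public Transform leftHandTarget;
        public Transform rightHandTarget;

        void Awake()
        {
            CreatePalmTarget(leftHandTarget);
            CreatePalmTarget(rightHandTarget);
        }

        static Transform CreatePalmTarget(Transform handTarget)
        {
            GameObject palm = new GameObject("Palm Target");
            palm.transform.parent = handTarget;
            // Local offset of 1: with the hand target scaled by 0.1,
            // this ends up 0.1 world units away from the wrist.
            palm.transform.localPosition = new Vector3(0f, 1f, 0f);
            palm.transform.localRotation = Quaternion.identity;
            return palm.transform;
        }
    }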

That is basically it! Ensure Debug is disabled (when enabled, it draws helper lines in the scene view), make sure Enabled is checked, keep Transition at 1 (I haven’t tried a different setting yet), and it should work.

There are however a few small flaws in the code which I corrected.

  1. The arms may rotate furiously while moving your hands. This can be solved by swapping the commented and uncommented LookAt lines, so that the versions using transform.root.up are active, as shown below:
    //Work with temporary game objects and align & parent them to the arm.
    transform.position = upperArm.position;
    transform.LookAt(targetPosition, elbowTargetPosition - transform.position);
    upperArmAxisCorrection.transform.position = upperArm.position;
    // Use the root's up vector instead of the arm's own up vector to prevent the flipping.
    upperArmAxisCorrection.transform.LookAt(forearm.position, transform.root.up);
    //upperArmAxisCorrection.transform.LookAt(forearm.position, upperArm.up);
    upperArm.parent = upperArmAxisCorrection.transform;
    forearmAxisCorrection.transform.position = forearm.position;
    forearmAxisCorrection.transform.LookAt(hand.position, transform.root.up);
    //forearmAxisCorrection.transform.LookAt(hand.position, forearm.up);
    forearm.parent = forearmAxisCorrection.transform;
  2. The (magic) ikAngle float may become NaN (not a number) in certain cases. If this happens, an error appears in the console and your arm may disappear into nothingness. Correct this by wrapping the rotation assignment in a check for NaN:
    //Apply rotation for temporary game objects.
    upperArmAxisCorrection.transform.LookAt(target, elbowTarget.position - upperArmAxisCorrection.transform.position);
    // Only apply the elbow rotation when ikAngle is a valid number.
    if (!float.IsNaN(ikAngle)) {
        upperArmAxisCorrection.transform.localRotation =
            Quaternion.Euler(upperArmAxisCorrection.transform.localRotation.eulerAngles - new Vector3(ikAngle, 0, 0));
    }

So those are the basics of inverse kinematics. Keep following me for improvements on this, including kneeling, turning and bending.

Instead of using the basic asset from the Asset Store, you can also try my own free asset using inverse kinematics for arms.

More information can be found here.

