JS feature request: scripted control of character animation sources


#1

Please note: This is a draft proposal, intended for discussion before being submitted as a worklist job - any comments / insights / additions / corrections are extremely welcome!

@philip @leviathan - Please note that this feature would require work on Interface that I would not feel confident in taking on, so I won’t be bidding on this if it is accepted as a worklist job.

A scheme for managing multiple animation sources

Currently, there are a number of character animation drivers:

  • Hardware (HMD, Leap, Hydras, DDE, etc)
  • FBX files containing animation data loaded by Interface
  • JavaScript joint rotation / position commands
  • ?

However, the decision as to which animation driver takes priority when more than one driver is attempting to control avatar animation is, as far as is known, hard-coded in Interface and not controllable by JavaScript. Outlined here is a scheme that would enable a JavaScript programmer to take control of the prioritisation process. The scheme also allows for the blending of animation from different drivers.

Solution

To achieve the required level of control, a new Controller object is created and exposed to JavaScript. The following modes are defined for the ‘Controller.animationDrivers’ object:

Default Mode - Fixed priority

By default, the priority is fixed - e.g. hardware driven animation / joint motion takes precedence, followed by JavaScript, then FBX.

Script Control Mode

Rotation and translation data from hardware input and FBX sources are made available to JavaScript via the Controller.animationDrivers object on a frame by frame basis. No animation drivers affect character animation by default, so it is the responsibility of the script to check for hardware / FBX driven animation and to explicitly apply it to the avatar. Sample pseudo-code for a JavaScript character animation controller might look something like this:

var animationDrivers = Controller.animationDrivers.requestControl();


if (!animationDrivers) {
    print('Cannot obtain animation control - does another script already have control?');
}

Script.update.connect(function(deltaTime) {

    for (var joint in skeleton) {

        if (animationDrivers) {
        
            // read any rotation / translation data from each of the hardware sources
            // read any rotation / translation data from FBX file loaded by Interface
            // calculate JavaScript animation rotation / translation
            // blend or prioritise animation data however you see fit
            // apply final rotations / translations to avatar

        } else {            

            // calculate JavaScript animation rotation / translation
            // apply final rotations / translations to avatar

        }
    }
});


Script.scriptEnding.connect(function() {

    // set the animation driver mode back to default
    Controller.animationDrivers.releaseControl();
});

When reading in the rotation / translation data for individual hardware sources from the animationDrivers object, the data would be supplied separately for each driver. The purpose of doing so would be to give the JavaScript programmer the ability to offer the end user fine grained control over animation options - for example, the user may have a Leap connected whilst using Hydras - in this situation, the user could be offered a ‘Hydras priority’ option in the script’s settings, thus removing the need to keep plugging and unplugging hardware. Similar options could be offered for other potentially conflicting hardware configurations like DDE / Oculus tracker movement.
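As a sketch of how such a 'Hydras priority' settings option might work - the driver names, the data shape and the HAND_DRIVER_PRIORITY setting are all illustrative assumptions, not a confirmed API:

```javascript
// Sketch only: pick which hand-tracking driver to apply, based on a
// user-selectable priority setting rather than plugging / unplugging hardware.
var HAND_DRIVER_PRIORITY = ['Hydra', 'Leap'];  // e.g. read from the script's settings UI

function selectHandDriver(driverData, priorityList) {
    // driverData maps driver name -> per-joint data (absent if the device is inactive)
    for (var i = 0; i < priorityList.length; i++) {
        var name = priorityList[i];
        if (driverData[name]) {
            return name;
        }
    }
    return null;  // no hardware active - fall back to scripted animation
}

// Example: both devices are connected, so the 'Hydra priority' setting wins
var active = selectHandDriver({ Leap: { RightArm: {} }, Hydra: { RightArm: {} } },
                              HAND_DRIVER_PRIORITY);
// active === 'Hydra'
```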

Applying the above to walk.js

For the most part, hardware driven animation would be given priority when the walk script is running. However, there are circumstances when it would be desirable to blend both the scripted and hardware driven animation.

Spine animation whilst walking

To achieve compelling results when walking, spine joint animation could be a blend of both hardware and scripted animation in a manner similar to this:

(Given percentages are guesstimates. Final values would be tweaked by trial and error)

Spine2: 100% controlled by any hardware driven animation
Spine1: 60% controlled by hardware, 40% controlled by walk script
Spine: 25% controlled by hardware, 75% controlled by walk script
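A minimal sketch of how these weights might be applied - plain numbers stand in for joint rotations here; a real implementation would blend quaternions (e.g. with slerp):

```javascript
// Per-joint hardware influence weights, as proposed above
var SPINE_WEIGHTS = { Spine2: 1.0, Spine1: 0.6, Spine: 0.25 };

// Linear blend: hardwareWeight of 1 means fully hardware driven,
// 0 means fully script driven
function blendValue(hardware, scripted, hardwareWeight) {
    return hardware * hardwareWeight + scripted * (1 - hardwareWeight);
}

// e.g. the HMD says lean 10 degrees, the walk script says 2 degrees:
blendValue(10, 2, SPINE_WEIGHTS.Spine1);  // 60% hardware, 40% script
```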

Intermittent use of arms hardware drivers

When using, for example, the Hydras, the user may put the controllers down. In this situation, it might be desirable to have the avatar’s arms gently blend back to scripted control whilst the Hydras are static.

In Summary

The scheme described above would present design options to a JavaScript programmer that are essential if character animation is to be seamlessly driven by multiple sources in an aesthetically pleasing manner.

  • davedub

#2

So FBX animations are also considered Controller based animation? Interesting…

As an additional suggestion to this:
The influence values for different parts of the skeleton should also be adjustable from the defaults: this would also allow for entities to be controlled in various ways, be it via script or animation files.

For example, using your spine figures, prioritisation levels could be set like this:

Skeleton.setScriptInfluence(entityId,{
    Spine: 0.25,
    Spine1: 0.6,
    Spine2: 1
});

The problem would be solving this issue between many scripts…

Also: as a side note, all skeleton / bone data in MyAvatar should have their own Interface to allow manipulation of bones, as it should be applicable to both entities and avatars (I’d suggest naming such: Skeleton)


#3

@Menithal - Thank you for the feedback. In reply:

Influence Values

It seems I wasn’t clear in explaining how the influence / blending works. To better explain:

In animation ‘Script Control Mode’, all animation drivers have zero influence - i.e. once this mode has been enabled, no animation drivers of any sort (hardware, FBX) have any effect on Avatar animation. The rotation and translation data for all drivers is made available to the script, but it is the responsibility of the script to actually apply the data on a frame by frame basis. Given this scheme, the influence of each driver is determined by simply deciding which values to apply, and to what degree.

To give an example:

The user has an HMD connected and runs an animation script that requests control of the animation drivers by enabling ‘Script Control Mode’, as described in the OP. The following process then takes place repeatedly for each joint (in the Script.update.connect handler):

  1. The script reads the HMD’s rotation / translation data from the Controller.animationDrivers object
  2. The script calculates its own translation / rotation values
  3. The script then blends the two sets of values. This is where the influence percentages are set
  4. The script applies the final, compounded rotations / translations to the avatar’s joint.
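The four steps above might be sketched for a single joint like this - all object shapes and names here are illustrative assumptions:

```javascript
// Sketch of steps 1-4 for a single joint, per frame
function updateJoint(jointName, hmdData, scriptRotation, hmdInfluence) {
    // 1) read the HMD's data for this joint (hmdData is null if the HMD is idle)
    var hmdRotation = hmdData ? hmdData[jointName] : null;

    // 2) the script's own calculated value is passed in as scriptRotation

    // 3) blend the two according to the chosen influence percentage
    var finalRotation;
    if (hmdRotation !== null && hmdRotation !== undefined) {
        finalRotation = hmdRotation * hmdInfluence + scriptRotation * (1 - hmdInfluence);
    } else {
        finalRotation = scriptRotation;
    }

    // 4) the result would then be applied to the avatar's joint,
    //    e.g. via something like MyAvatar.setJointData(jointName, finalRotation)
    return finalRotation;
}
```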

Given this fine grained method of setting the influence of each individual animation driver, for each joint, on each frame, I’m not sure why the Skeleton.setScriptInfluence call you suggested is required, or where it fits into the scheme.

Applying the scheme to Entities

Your suggestion about extending this scheme to cover all Entity animation sounds great! I’ve had no experience animating Entities as yet, but I note that in addition to scripted translation and rotation, it’s also possible to apply FBX animation to Entities, so the same need for control over prioritisation and blending is present. I think the scheme described above could very easily be applied to Entity animation too, the main difference being that in either mode, hardware drivers would (by default) have no effect on the Entity’s animation.

FBX files as animation driver

It’s worth bearing in mind that it’s likely that other animation drivers could exist - e.g. Avatar enters a ragdoll state, or (hopefully with a future physics system / IK rig upgrade) a projectile hits a limb and creates physics based joint movement. It would make the JavaScript programmer’s life easier if all current and future animation drivers presented their data in a uniform way via a single object, hence the idea of creating the Controller.animationDrivers object.
I can see that parenting Interface’s FBX file player to the Controller object could be seen as a bit ambiguous, but if we consider anything that drives character animation as a Controller, regardless of whether it is hardware or software based, it kinda makes sense - but I’m very open to suggestions as to which object the animationDriver is parented to (if any), how it is obtained and what it is called.

As a JavaScript programmer focussing on animation, all I really want is access to each and every animation driver’s rotation and translation values, and complete control over how they are applied…

Side note

I’m not quite sure what you mean here - we already have MyAvatar.setJointData etc for joint manipulation - could you explain some more?

  • davedub

#4

In short: the name MyAvatar would imply that only Avatars can have joints, but Entities can also have them… it’s just a reshuffling of the scripting interface.


#5

Yes, but where is getJointData? How can a script react to a physical event that’s slammed an avatar onto a surface and produce an ‘autonomic’ response? In an RP, how can a script respond when some other procedural animation has done a sword swing which has landed on armor? Those sorts of things.


#6

A revised scheme for managing multiple animation sources

Taking in comments from @Menithal and @Balpien_Hammere (thank you both for the feedback), I have revised the proposal and broadened its scope.

As before, none of my proposed syntax / structure is intended as definitive - my intention is to communicate a set of requirements for (and from the perspective of) a JavaScript programmer obsessively chasing the goal of the most natural animation possible…

As usual, all comments, insights, additions or corrections are extremely welcome!

Introduction

There are a number of potential animation drivers that can drive Avatar or Entity animation:

Avatar animation drivers

  • Forces exerted by physics (special case, see below)
  • Hardware (HMD, Leap, Hydras, DDE, etc)
  • FBX files containing animation data loaded by Interface
  • JavaScript joint rotation / position commands
  • Ragdoll animation (special case, see below)

Entity animation drivers

  • FBX files containing animation data loaded by Interface
  • JavaScript joint rotation / position commands
  • Forces exerted by physics

However, the decision as to which animation driver takes priority when more than one driver is attempting to control Avatar / Entity animation is currently hard-coded in Interface and not controllable by JavaScript. This situation is already proving problematic, as currently most hardware driven animation is ignored if JavaScript joint rotation is applied (e.g. HMD upper body movement is ignored if walk.js is running).

This proposal / feature request describes a scheme that enables the JavaScript programmer to detect activity from different animation drivers, prioritise them and finally blend them together programmatically in any way they see fit.

Scope and Assumptions

It is important to note that Avatar / Entity motion is not dealt with here - this proposal deals with the animation (i.e. the deformation) of an Avatar / Entity, not changes in root position. Changes in root position are, like the prevention of surface intersection on collision, assumed to be the responsibility of Interface.
However, it should be noted that small root offsets are (and look set to continue to be) necessary for character animation (e.g. hips sway whilst walking).

Proposed Solution

Outlined here is a scheme that would enable a JavaScript programmer to take control of the animation driver prioritisation process and allow for the blending of animation from different drivers. The scheme is further extended to allow for animated reactions to physical forces. To achieve the required level of control, a new AnimationDrivers object is exposed to JavaScript. The AnimationDrivers object would be a child of the MyAvatar, Avatar or Entity objects. Each AnimationDrivers object would have two modes:

Default Mode - Fixed priority

By default, the priority is fixed. Blending between animation drivers is not possible. The exact priorities are to be decided, but for an Avatar a sensible scheme might be (from highest to lowest priority): Physics, Hardware, JavaScript, FBX.

Scripted Control Mode

Animation data is made available to JavaScript via the AnimationDrivers object on a frame by frame basis. None of the animation drivers actually drive any joint animation in this mode, so it is the responsibility of the script to check for active AnimationDrivers and to explicitly apply the animation to the Avatar / Entity. Sample pseudo-code for a JavaScript character animation controller might look something like this:

var animationDrivers = MyAvatar.animationDrivers.requestControl();


if (!animationDrivers) {
    print('Error: Cannot obtain animation control - does another script already have control of AnimationDrivers for this object?');
}

Script.update.connect(function(deltaTime) {

    for (var joint in skeleton) {

        if (animationDrivers.hasData()) {

            // 1) poll AnimationDrivers for current animation driver data
            // 2) calculate any JavaScript animation rotation / translation
            // 3) blend or prioritise animation data from all sources however you see fit
            // 4) apply final animation values to Avatar

        } else {            

            // calculate JavaScript animation rotation / translation
            // apply final rotations / translations to Avatar
        }
    }
});

Script.scriptEnding.connect(function() {

    // set the animation driver mode back to default
    MyAvatar.animationDrivers.releaseControl();
});

Each animation driver would supply its animation data separately. Each driver would hold joints and their animation data in an array, i.e.

MyAvatar.animationDrivers.driver['Leap'].joint['RightArm'].rotation
MyAvatar.animationDrivers.driver['Leap'].joint['RightArm'].translation
MyAvatar.animationDrivers.driver['Leap'].joint['RightArm'].force
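Given a structure shaped like the layout above, a script could poll which drivers are currently supplying data for a given joint. A sketch, using a purely illustrative mock object in place of the proposed API:

```javascript
// List the drivers that supply rotation or force data for jointName this frame
function activeDriversForJoint(drivers, jointName) {
    var active = [];
    for (var name in drivers.driver) {
        var joint = drivers.driver[name].joint[jointName];
        if (joint && (joint.rotation !== undefined || joint.force !== undefined)) {
            active.push(name);
        }
    }
    return active;
}

// Mock data shaped like the proposed layout, for illustration only
var mockDrivers = {
    driver: {
        Leap:    { joint: { RightArm: { rotation: 0.1, translation: 0 } } },
        Ragdoll: { joint: {} }  // requested, but supplying no data this frame
    }
};

activeDriversForJoint(mockDrivers, 'RightArm');  // ['Leap']
```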

Physics as an Animation Driver

Whilst it is desirable that all animation drivers present their data in a uniform way (i.e. joint rotation or translation data), in the case of physics animation drivers, any forces acting on the Avatar / Entity would be made available to JavaScript as a vector. This force could represent, for example, the result of a collision.

However, with the current collision hulls employed by Interface, a force acts only on the Avatar / Entity as a whole; such forces therefore affect only motion, and there is currently no scope for physics based animation.

To provide a way of animating reactions to physics forces on different parts of the Avatar / Entity, collision hulls created by the convex decomposition of a surface could be employed (see https://code.google.com/p/v-hacd/).
The ability to create collision hulls using convex decomposition is part of Bullet physics and is already scheduled for implementation in Interface. Once in place, physical forces could be detected by different parts of the Avatar / Entity, thus creating the conditions necessary for animating reactions to physical forces.

Collision hulls created by convex decomposition of a surface

Simple example

Using the example of scripting the animation for an Avatar punching a wall (for simplicity):

  1. Script action during the frames up to the moment of impact
    The script checks for animation driver input and finds none. The script rotates the Avatar’s arm to animate the arc of a punch.

  2. Script action during and after impact
    On the first frame after impact, the script checks for animation driver input and discovers a force acting in the opposite direction to that of the punch’s arc on the Avatar’s hand joint. The script then decides to stop trying to animate the arc of the punch, and to react to the collision - the arm could be animated to just stop where it is, to enable ragdoll mode in just the arm to produce a natural reaction to the force or the entire Avatar could be animated to fall to the floor, clutching his hand in pain.

In the case where the JavaScript programmer chooses to ignore the force and continues trying to animate the arc of the punch, Interface would again detect the collision on the next frame and prevent the intersection of the Avatar’s hand and the surface.
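Detecting whether a reported force opposes the punch could be as simple as a dot product check against the direction of travel. A sketch (the vector shapes are assumptions):

```javascript
// Standard 3D dot product
function dot(a, b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// A negative dot product means the force points against the motion,
// i.e. the hand has hit something
function forceOpposesMotion(force, motionDirection) {
    return dot(force, motionDirection) < 0;
}

// punch travelling along +x, wall pushes back along -x:
forceOpposesMotion({ x: -5, y: 0, z: 0 }, { x: 1, y: 0, z: 0 });  // true
```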

Ragdoll animation

To further enable compelling, natural movement, it would be desirable for the JavaScript programmer to have access to animation data representing ragdoll animation (i.e. data representing what would happen to each joint if it were subjected to physical forces like gravity and the result of any momentum). For example, if the programmer wanted to have the Avatar simply collapse on the spot, he would read the animation data presented by the MyAvatar.animationDrivers.drivers['Ragdoll'] driver and fade it in over a number of frames.

The ability to apply ragdoll data only on specific body parts would enable reactions to collisions, such as ‘tripping up’ if a small obstacle was hit whilst walking, or flailing limbs when slammed into the ground.

To avoid wasted processing, Interface should not calculate ragdoll animation data when it is not required. For this reason, it’s suggested that ragdoll animation data is only made available when requested by the JavaScript code, for a specific number of frames. This would mean a one frame delay between the script needing the driver’s data and getting access to it, but it seems unlikely that this delay would have any negative visual / aesthetic impact. The code might look something like:

var NUM_RAGDOLL_FRAMES = 120;
MyAvatar.animationDrivers.drivers['Ragdoll'].requestData(NUM_RAGDOLL_FRAMES);
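The fade-in itself might be driven by a simple per-frame weight. A sketch - FADE_IN_FRAMES and the linear ramp are illustrative choices, and the weight would be used to blend the ragdoll joint data with the current pose:

```javascript
var FADE_IN_FRAMES = 30;  // frames over which the ragdoll takes hold

// Ramps linearly from 0 to 1 over fadeInFrames, then holds at 1
function ragdollWeight(framesSinceRequest, fadeInFrames) {
    return Math.min(1, framesSinceRequest / fadeInFrames);
}

ragdollWeight(15, FADE_IN_FRAMES);  // half way through the fade: 0.5
```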

Applying the above to walk.js

For the most part, hardware driven animation would be given priority when the walk script is running. This would be true of Hydras, Leap, etc. However, there are circumstances when it would be desirable to blend both the scripted and hardware driven animation:

HMD induced spine animation whilst walking

To achieve compelling results when walking, spine joint animation could be a blend of both hardware and script driven animation in a manner similar to this:

(Given percentages are guesstimates. Final values would be tweaked by trial and error)

Spine2: 100% controlled by any hardware driven animation
Spine1: 60% controlled by hardware, 40% controlled by walk script
Spine: 25% controlled by hardware, 75% controlled by walk script

This blending would take place in step (3) in the pseudo-code example above.

Intermittent use of arms hardware drivers

When using, for example, the Hydras, the user may put the controllers down. In this situation, it might be desirable to detect the Hydras rest position and have the avatar’s arms gently blend back to scripted control.
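A minimal sketch of such a blend-back, assuming a per-frame controller speed is available - the threshold and ease rate are guesses that would be tuned by trial and error:

```javascript
var REST_SPEED_THRESHOLD = 0.01;  // controller speed below which we assume 'put down'
var EASE_RATE = 0.05;             // fraction of the remaining distance moved per frame

// Eases the hardware influence on the arms toward 0 while the Hydras are
// at rest, and back toward 1 when they move again
function updateArmInfluence(currentInfluence, controllerSpeed) {
    var target = controllerSpeed < REST_SPEED_THRESHOLD ? 0 : 1;
    return currentInfluence + (target - currentInfluence) * EASE_RATE;
}
```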

In Summary

The scheme described above would present design options to a JavaScript programmer that are essential if character animation is to be seamlessly driven by multiple sources in an aesthetically pleasing manner.
  • davedub