Walk.js 2.0 beta - [fixed: pre-rotations]


#1

Since the recent breaking changes, true high fidelity animation in JavaScript is no longer possible. Before these changes, it was possible to import bvh files directly into JavaScript using walkTools. An animation pipeline had been established: bvh -> walkTools -> walk.js harmonic animation JSON file. As long as the avatar FBX specified pre-rotations, the pipeline produced near perfect results across all avatars. The harmonic animation system was then being used as a base for dance.js, whereby an avatar (or NPC) could dance in time to music (or any other rhythm). A number of other ideas were also being explored.

However, since the breaking changes, further development of the harmonic animation system is no longer viable. Before I shelve the project, I have decided to put up one last beta in the hope that it provides the following:

1) A quick, easy way for developers to test possible new solutions (currently demonstrating the spurious values generated by the new MyAvatar.getDefaultJointRotation function)
2) A 100% clear demonstration of why pre-rotations are essential in the context of avatar joint rotations in JavaScript

Where we are now

The requirement that is now broken for JavaScript animation is as follows: a reference pose (t-pose) is required. This means that when all the avatar’s joints are set to zero rotation, the avatar takes on a t-pose. The importance of this requirement is described in detail in this post.
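As a minimal sketch of what this requirement means in practice (the helper name is invented; the MyAvatar joint calls are the standard scripting API), zeroing every joint should yield the t-pose:

```javascript
// Identity quaternion: "no rotation" for a joint.
var IDENTITY_ROTATION = { x: 0, y: 0, z: 0, w: 1 };

// Illustrative helper: put the avatar into its reference pose by applying
// zero rotation to every joint. With correct pre-rotations honoured,
// this should produce a t-pose.
function applyReferencePose() {
    var jointNames = MyAvatar.getJointNames();
    for (var i = 0; i < jointNames.length; i++) {
        MyAvatar.setJointRotation(jointNames[i], IDENTITY_ROTATION);
    }
}

// Only run inside Interface, where MyAvatar exists.
if (typeof MyAvatar !== "undefined") {
    applyReferencePose();
}
```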

Steps to reproduce results below

1) Load walkTools using the URL given below - the walkTools toolbar appears at the top of your screen
2) From the toolbar, open the Editor and select 'Rotation' mode
3) Select 't-pose' from the 'Animation' drop down in the Editor
4) Click the 'Settings' button in the bottom right hand corner of your screen
5) Choose to apply no pre-rotations, 'HiFi' pre-rotations or 'davedub' pre-rotations and observe the results

no pre-rotations

Currently, applying zero rotation to all joints will put the avatar in this pose:

getDefaultJointRotation pre-rotations (aka HiFi pre-rotations)

If the new MyAvatar.getDefaultJointRotation values are applied, applying zero rotation to all joints will put the avatar in this pose:

I have assumed the values returned by MyAvatar.getDefaultJointRotation are in radians, as they are all very small, so they have been converted to degrees. @hyperlogic: clearly, the pre-rotation values coming from MyAvatar.getDefaultJointRotation are not correct.

my naive pre-rotations

Using a very simple set of guesstimate pre-rotations (see here), applying zero rotation to all joints will put the avatar in this pose:

However, it must be noted that these naive values do NOT work well: an imported JSON animation will NOT look the same as it did in Maya / MotionBuilder / Blender - feet will be twisted or not flat on the ground, arms will collide with hips, fingers will be bent out of shape. If JavaScript animation is ever to be useful, we NEED the avatar’s pre-rotations to be honoured!

walk.js and walkTools URLs

Both walk 2.0 and walkTools can be run directly from URL:

 walk.js 2.0 beta: https://hifi-content.s3.amazonaws.com/dave/walk-beta/walk.js

 walkTools: https://hifi-content.s3.amazonaws.com/dave/walk-tools/walk.js

  • davedub

#2

Thanks for sharing this - it’s much easier to debug and address issues when there is working code to look at. As for the pre-rotation issue, I believe this is just a bug. MyAvatar.getDefaultJointRotation() returns a quaternion, not Euler angles. I’ve made a pull request for your project on GitHub, which should address this issue.
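To illustrate the distinction, here is a self-contained sketch of reading a quaternion back as Euler degrees. The ZYX (yaw-pitch-roll) decomposition below is an illustrative assumption, not necessarily the convention Interface uses; inside Interface you would simply pass the quaternion to Quat.safeEulerAngles rather than treating its components as angles:

```javascript
// Convert a quaternion {x, y, z, w} to Euler angles in degrees,
// assuming a ZYX (yaw-pitch-roll) decomposition.
function quatToEulerDegrees(q) {
    var RAD_TO_DEG = 180 / Math.PI;
    var roll = Math.atan2(2 * (q.w * q.x + q.y * q.z),
                          1 - 2 * (q.x * q.x + q.y * q.y));
    // clamp to avoid NaN from tiny floating point overshoot
    var pitch = Math.asin(Math.max(-1, Math.min(1, 2 * (q.w * q.y - q.z * q.x))));
    var yaw = Math.atan2(2 * (q.w * q.z + q.x * q.y),
                         1 - 2 * (q.y * q.y + q.z * q.z));
    return { x: roll * RAD_TO_DEG, y: pitch * RAD_TO_DEG, z: yaw * RAD_TO_DEG };
}

// A 30 degree rotation about x, expressed as a quaternion:
var halfAngle = (30 * Math.PI / 180) / 2;
var q30x = { x: Math.sin(halfAngle), y: 0, z: 0, w: Math.cos(halfAngle) };
var euler = quatToEulerDegrees(q30x);  // euler.x is approximately 30
```

The key point: the quaternion components themselves (here x ≈ 0.259) are small numbers that could easily be mistaken for radians, which is exactly the confusion in post #1.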

Cheers,


#3

Just tried walkTools.js again – really cool stuff, including the UX design.

Using @hyperlogic’s patch it looks like a correct default pose now appears – at least when using a Maya-exported avatar (anything exported from Blender still seems SNAFU’d).

Couldn’t get animations to work in general – or even anything to “bounce around” if using system pre-rots. Maybe there are other euler->quat fixes needed in the scripts somewhere?

Experimentally it holds true that MyAvatar.getDefaultJointRotation(joint) == FBXJoint.preRotation * FBXJoint.rotation (and probably * postRotation, but so far that’s always zero). Note that these are the same numbers available in the automated JSON dump.
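A small sketch of how such an equality can be checked numerically (the function name is illustrative). Since q and -q represent the same rotation, comparing the absolute dot product against 1 is more robust than comparing components directly:

```javascript
// Two unit quaternions represent the same rotation when the absolute
// value of their dot product is (near) 1 -- this treats q and -q as equal.
function sameRotation(a, b, tolerance) {
    var dot = a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
    return Math.abs(dot) >= 1 - tolerance;
}

var q = { x: 0.5, y: 0.5, z: 0.5, w: 0.5 };
var negQ = { x: -0.5, y: -0.5, z: -0.5, w: -0.5 };
// q and -q are the same rotation:
var equivalent = sameRotation(q, negQ, 1e-6);  // true
```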

Anyway, since two or more scripts wishing to contribute sensible influences across joints still need to talk to each other directly – if you ever work on your walk tools again, maybe you could consider some kind of dynamic gesture overlay. Like letting other scripts use the Messaging.* API to speak to your animation machinery on-the-fly or something.
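A rough sketch of what such an overlay channel could look like using the Messages API (the channel name and message shape here are invented for illustration):

```javascript
// Hypothetical channel and message format for requesting gestures
// from a running animation script.
var GESTURE_CHANNEL = "com.example.walkjs.gestures";  // invented name

function makeGestureMessage(gestureName, strength) {
    return JSON.stringify({ gesture: gestureName, strength: strength });
}

// Inside Interface, the animation script would subscribe and listen:
if (typeof Messages !== "undefined") {
    Messages.subscribe(GESTURE_CHANNEL);
    Messages.messageReceived.connect(function (channel, message, senderID) {
        if (channel !== GESTURE_CHANNEL) {
            return;
        }
        var request = JSON.parse(message);
        // ...blend request.gesture into the current animation here...
    });
    // ...and any other script could then send, on the fly:
    // Messages.sendMessage(GESTURE_CHANNEL, makeGestureMessage("wave", 0.8));
}
```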

Because some day I hope our avatars can be taught to walk and chew gum… at the same time! :slight_smile:


http://www.jantoo.com/cartoon/33234524 via @jantoocartoons


#4

@hyperlogic - tysm for the PR, now it all makes a lot more sense :slight_smile: However, we do still have issues:

Testing the pre-rotation options

I’ve updated the S3 versions of walk and walkTools with your changes, run some tests and have put together a movie to show the results:

Notes on pre-rotation tests

Both the HiFi and naive pre-rotations fail to give expected results across all animations; in particular, the HiFi pre-rotations work well for the t-pose but look terrible when the walk animation is played. Conversely, the naive pre-rotations have issues with the t-pose but work quite well for the walk animation.

Is it correct to assume that the pre-rotations exposed by MyAvatar.getDefaultJointRotation are generic, and do not represent the avatar’s true, unique pre-rotations, as specified in the avatar’s original FBX?

If so, I think these tests clearly demonstrate the need for each avatar’s true, unique pre-rotations (aka joint orients in Maya) to be honoured and pre-applied before any JS joint rotations are applied - i.e. any other substituted pre-rotation values will never work properly for all animations across all avatars.

Note on using walkTools

In the movie, the avatar view is rotated using the compass control on the Editor. To use the compass control on the Editor, first enable the independent camera by pressing the ‘5’ key. To return to 3rd person camera, press the ‘3’ key.

walk.js and walkTools URLs

Both walk 2.0 and walkTools can be run directly from URL:

walk.js 2.0 beta: https://hifi-content.s3.amazonaws.com/dave/walk-beta/walk.js

walkTools: https://hifi-content.s3.amazonaws.com/dave/walk-tools/walk.js


#5

100% agreed - whilst we are still very much still taking baby steps with JS avatar / NPC animation, it is really important to keep in mind that at some point JS developers are sure to want the ability to select and blend animation data from a wide variety of animation drivers. I made some notes on the subject here last July: https://alphas.highfidelity.io/t/js-feature-request-scripted-control-of-character-animation-sources/6851/6

I’d really like to find out what the issue is here - as you can see from this movie, all the animations should play with either naive or HiFi pre-rotations applied, albeit not with the same level of fidelity as before the breaking changes:

The only thing that I know is completely broken right now is the ability to import bvh files. The cause of this is described here: https://alphas.highfidelity.io/t/get-and-set-not-working-in-javascript-any-more/9490

This is something I intend to tackle if / when the pre-rotation issues have been resolved…


#6

We make sure that all joints are zeroed out before export, which means the pre-rotations are exactly the same as the default pose. To be clear: for our standard avatars, MyAvatar.getDefaultJointRotation() is identical to the pre-rotation.

As to why the walk is different, I don’t know. Perhaps you authored your procedural animations using the reverse engineered set of pre-rotations, which was slightly different from the actual pre-rotations, so now that you are using the correct ones it looks different.

When you apply your curves in Maya, does the animation look correct?


#7

Agreed :slight_smile: - I have now confirmed this by downloading a couple of avatars, inspecting their pre-rotation values using the Open3Dmod FBX viewer and manually checking them against the quaternion values exposed by MyAvatar.getDefaultJointRotation. Thank you so much for taking the time to help me get things moving again!

(note to @Menithal - Open3Dmod is a great little FBX viewer that displays the pre-rotations in both Euler and quaternion format, so might prove useful to you in the Blender avi quest?)

So it would appear that my animations DO have some sort of problem - which is really, really odd, as they worked perfectly before (i.e. back when pre-rotations were pre-applied).
Is it possible the old way Interface used to pre-apply the pre-rotations for JS animation had issues that I inadvertently worked around?

I’ve not yet made an exporter, so I can’t test this just yet. I don’t have a copy of Maya either, but I could make a bvh exporter and check my curves that way.

However, I think my best plan of action is to re-visit my animation production pipeline (bvh -> walkTools bvh converter -> harmonic animation file). The snag here is that I was relying on a couple of THREE.js objects to do the conversion from bvh Euler angles to pitch/yaw/roll angles, but the THREE.js library is currently incompatible with HiFi (due to the recent removal of get / set functionality from the QT JS backend, see here for details).

So, moving forwards, I am now implementing a way of converting Euler angles to pitch/yaw/roll angles that doesn’t rely on THREE.js. I’ve started work on implementing the method described here: http://www.euclideanspace.com/maths/geometry/rotations/conversions/eulerToQuaternion/index.htm
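As a self-contained sketch of the per-axis approach (no THREE.js required), an Euler rotation in a given order can be converted by building one quaternion per axis and multiplying them in sequence. The ZXY order below matches the bvh convention discussed in this thread; all function names are illustrative:

```javascript
// Hamilton product of two quaternions {x, y, z, w}.
function quatMultiply(a, b) {
    return {
        w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
    };
}

// Quaternion for a rotation of `degrees` about a single axis ("x", "y" or "z").
function axisAngleQuat(axis, degrees) {
    var half = (degrees * Math.PI / 180) / 2;
    var q = { x: 0, y: 0, z: 0, w: Math.cos(half) };
    q[axis] = Math.sin(half);
    return q;
}

// Convert ZXY-ordered Euler rotations (in degrees) to a single quaternion.
function eulerZXYToQuat(rotations) {
    var qz = axisAngleQuat("z", rotations.z);
    var qx = axisAngleQuat("x", rotations.x);
    var qy = axisAngleQuat("y", rotations.y);
    return quatMultiply(quatMultiply(qz, qx), qy);
}

// Single-axis sanity check: 30 degrees about x only.
var q = eulerZXYToQuat({ x: 30, y: 0, z: 0 });
// q.w is approximately cos(15 deg) and q.x approximately sin(15 deg)
```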

Here I go again!


#8

… not entirely sure I understand what you’re aiming for, but one of the reasons I created glm-js was to help eradicate quaternion pain in JavaScript so maybe it can help here.

It doesn’t yet implement the whole enchilada, but what it does do is abundantly stable and unit tested – based faithfully on GLM C++, which is based faithfully on GLSL.

What I actually do with glm-js is wrangle and maintain (in parallel triplicate) a set of “identical” wrappers around THREE.js, gl-matrix and tdl-fast JS math routines – using all three at development time to cross-verify and prove 100% correctness of my consolidated glm-js API.

Anyway, here’s how to use glm-js for Euler/Quaternion conversions within HiFi:

> Script.include('https://cdn.rawgit.com/humbletim/54ba84a38d0600fa1661/raw/25bb203cbe917b9c48fb78f8b5300482539a6442/polyfill-hifi-glm.js');

> rot = glm.vec3( { x:30, y:0, z:0 } )
// fvec3(30.000000, 0.000000, 0.000000)

> q = glm.quat( glm.radians( rot ) )
// <quat>fvec3(30.000000, 0.000000, 0.000000)

> glm.degrees( glm.eulerAngles( q ) ).x
// 30

> glm.quat( MyAvatar.getJointRotation("RightArm") )
// <quat>fvec3(-11.507488, -0.609472, -1.103078)

> MyAvatar.setJointRotation("RightArm", q)
> glm.quat( MyAvatar.getJointRotation("RightArm") )
// <quat>fvec3(30.000000, 0.000000, 0.000000)

The “<quat>fvec3(…)” output comes from pretty-printing into Euler/degrees – which I have happen by default (when values are coerced to strings) because that seems more humane.

Internally, however, quats are always ugly bastards (and always stored as radians):

> JSON.stringify( glm.quat( MyAvatar.orientation ) )
// {"w":0.20710207521915436,"x":0.00011860704398714006,"y":-0.9783193469047546,"z":0.000025108109184657224}

Per GLM conventions, to get at the Euler angles in degrees you could do:

> JSON.stringify( glm.degrees(glm.eulerAngles(glm.quat(MyAvatar.orientation))) )
// {"x":180,"y":-23.905160903930664,"z":-179.98611450195312}

Some other examples:

// HiFi "Vec3" or "Quat" values just need to be wrapped with the corresponding initializer
> front = glm.vec3(0,0,1).mul( glm.quat(Camera.orientation) )

// going in the other direction is even easier (because it's automatic)
> Entities.editEntity(uuid, { dimensions: glm.vec3(1), position: front, rotation: q }) 

// mix-and-match is fine too (but mind your sanity if switching too-arbitrarily)
> glm.vec3( Vec3.sum( Camera.position, front ) )

If you’d like to know more, just let me know. glm-js already supports a lot of GLM’s other features including vec2, vec4, mat3, mat4, matrix operations, swizzles and other neat stuff.

Also here’s a fun fact – (coincidentally) Interface.exe uses GLM C++ for nearly all of its complex math at the application level – and many of the math methods seen in scripting have evolved at the C++ level into simple, lightweight proxies around GLM functions.


#9

@humbletim - thank you very much for the code sample - I’ve just used it and have found it produces the exact same result as the method I came up with earlier today (below). It also produces the same result as when the old version of the walk animation is played (see movie below).

quaternion multiplication method of converting Euler rotations to quats:

Bvh.eulerRotationsToAngles = function(rotations, rotationOrder) {

    if (rotationOrder === "ZXY") {

        // build one quaternion per axis
        var quatZ = Quat.fromPitchYawRollDegrees(0, 0, rotations.z);
        var quatX = Quat.fromPitchYawRollDegrees(rotations.x, 0, 0);
        var quatY = Quat.fromPitchYawRollDegrees(0, rotations.y, 0);

        // compose the axis rotations in sequence, starting from the identity quaternion
        var finalQuat = { x: 0, y: 0, z: 0, w: 1 };
        finalQuat = Quat.multiply(finalQuat, quatZ);
        finalQuat = Quat.multiply(finalQuat, quatX);
        finalQuat = Quat.multiply(finalQuat, quatY);

        // convert back to pitch/yaw/roll angles
        rotations = Quat.safeEulerAngles(finalQuat);
    }
    return rotations;
};

movie showing where I am now
https://www.youtube.com/watch?v=6XZ7dmey7XQ

Before Interface stopped automatically applying pre-rotations, it was possible to export a bvh file with no prerotations from MotionBuilder and import it into walkTools with perfect fidelity. It was a happy day when I got that working - see here: yataa!

Since the change, what used to work no longer does - the paths of the feet are no longer parallel, giving the figure the appearance of ‘skating’. I saw this effect many times during previous development - it was an artefact of the pre-rotations being wrong. However, having experimented under the new conditions, I’ve become convinced of the following:

  1. The pre-rotations exposed by the new MyAvatar.getDefaultJointRotation are indeed a match for each unique avatar
  2. In theory (and in previous practice), a bvh file exported minus its pre-rotations should play back perfectly when imported into HiFi
  3. In practice, there is something terribly wrong! It is possible that BOTH methods of calculating the quaternion from the Euler angles are wrong, but it seems unlikely.

I’ll get back on this in the new year - happy new year to one and all!


#10

Solved - always so simple in retrospect! (big thanks to @hyperlogic and @humbletim, couldn’t have done it without you!)

bvh importer / converter = working
walk.js = working
walkTools = working

In simple terms, I have avoided doing any unnecessary Euler / quaternion conversions and it’s all working great again!

applying pre-rotations correctly

Code snippet:

// after calculating jointRotations in the normal way:
var rotationsQ = Quat.fromVec3Degrees( jointRotations );
if (avatar.isUsingHiFiPreRotations) {
    var jointNumber = MyAvatar.getJointIndex( jointName );
    var preRotationsQ = MyAvatar.getDefaultJointRotation( jointNumber );
    rotationsQ = Quat.multiply( preRotationsQ, rotationsQ );
}
// apply rotations
MyAvatar.setJointRotation( jointName, rotationsQ );

pre-rotations correctly applied

The plan now

  1. Prepare walk.js for PR and submit. This will represent the end of work on walk.js (for now).
  2. Continue where I left off with dance.js. The plan is to re-apply the harmonic system to an NPC that will dance in time to music.
  3. Get my Emotiv Insight working (I’m STILL awaiting support) and apply the user’s mood to the walk gait, as described here

Truly a happy new year!


#11

Now that caught my attention. Is there some JavaScript interface method that can do analysis of an audio stream?


#12

@Balpien_Hammere - not that I know of as yet. The prototype I’ve been working on simply samples MyAvatar.audioLoudness each frame and very roughly detects peaks. It’s far from ideal, but with some filtering and bounds setting, I’ve managed to generate some reasonable results. I’m hoping to find a better solution within the (as yet unexplored) JS audio stuff…
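A minimal sketch of the kind of peak detection described (the smoothing factor and threshold are illustrative guesses, not the prototype's actual values). The pure detector below could be fed from MyAvatar.audioLoudness each frame via Script.update inside Interface:

```javascript
// Simple loudness peak detector using an exponential moving average.
// A "peak" is a sample that exceeds the running average by some factor.
function makePeakDetector(smoothing, threshold) {
    var average = 0;
    return function (loudness) {
        var isPeak = average > 0 && loudness > average * threshold;
        // update the running average (exponential smoothing)
        average = smoothing * average + (1 - smoothing) * loudness;
        return isPeak;
    };
}

var detectPeak = makePeakDetector(0.9, 2.0);  // illustrative values

// Inside Interface, this would be driven by the audio loudness each frame:
if (typeof Script !== "undefined" && typeof MyAvatar !== "undefined") {
    Script.update.connect(function (deltaTime) {
        if (detectPeak(MyAvatar.audioLoudness)) {
            // ...trigger / re-synchronise the dance cycle here...
        }
    });
}
```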

The longer term goal for this project is to use depth camera / live mocap data / Leap output to determine the dance frequency. The user input might be the user’s hip movements, or might just be tapping their fingers. The idea is inspired by (but not quite the same as) a scene in ‘Ready Player One’ - quoting:

I loaded up a piece of high-end avatar dance software called Travoltra, which I’d downloaded and tested earlier that evening. The program took control of Parzival’s movements, synching them up with the music, and all four of my limbs were transformed into undulating cosine waves.

The nature of my harmonic animation system allows for exact frequency matching with no interpolation errors. Generally, the system is ideally suited to any animation application that requires accurate frequency matching for smooth rhythmic / repeated action.
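For illustration, the core of such frequency matching can be sketched as a phase-continuous sine driver (all names and values here are illustrative, not the actual walk.js internals):

```javascript
// Joint angle (degrees) at time t (seconds) for a harmonic oscillation:
// amplitude * sin(2 * pi * frequency * t + phase). Because the angle is an
// exact function of time and frequency, there is no keyframe interpolation
// error to accumulate, and the motion can match any beat frequency exactly.
function harmonicAngle(t, frequencyHz, amplitudeDegrees, phaseRadians) {
    return amplitudeDegrees * Math.sin(2 * Math.PI * frequencyHz * t + phaseRadians);
}

// Example: a 1 Hz swing of +/- 20 degrees peaks a quarter cycle in.
var peak = harmonicAngle(0.25, 1, 20, 0);  // 20
// Inside Interface this could drive a joint each frame, e.g.:
// MyAvatar.setJointRotation("RightArm",
//     Quat.fromPitchYawRollDegrees(harmonicAngle(elapsed, beatHz, 20, 0), 0, 0));
```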

My idea is to eventually allow either user- or audio-triggered dance synchronisation - but right now it’s very much at the feasibility study stage. It may be hopelessly ambitious, but I’m going to try anyway!