MyAvatar animation JavaScript API changes


#1

In the last few months we’ve been overhauling the existing animation system used for avatars and model entities. We’ve moved to a blend-tree style of describing the avatar’s animation state machines. Eventually we hope to have all the capabilities of this system documented and exposed to JavaScript, but we are starting with some of the existing functionality first and we’d like to get your feedback.

First off, there will be new methods to override the animation currently playing on the avatar with a custom one. This is a complete override: any IK for hand controllers/head tracking will be disabled. Once the animation is complete, you can restore MyAvatar back to its normal state animations.

MyAvatar.overrideAnimation(url, fps, loop, firstFrame, lastFrame);
MyAvatar.restoreAnimation();

These replace MyAvatar.playAnimation and MyAvatar.stopAnimation.
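
For example, a script might play a one-shot clip and then hand control back (just a sketch - the animation URL, fps and frame range below are placeholders):

var WAVE_URL = "https://example.com/animations/wave.fbx"; // hypothetical URL
var FPS = 30;
var LAST_FRAME = 53;
MyAvatar.overrideAnimation(WAVE_URL, FPS, false, 0, LAST_FRAME);

// restore the normal state machine once the clip has played out
Script.setTimeout(function () {
    MyAvatar.restoreAnimation();
}, (LAST_FRAME / FPS) * 1000);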

Next, there will be new methods to override a specific “role” of the avatar’s animation, for example “idle” or “walk”, with a custom animation that you provide. This can be used to customize one particular aspect of an avatar without having to replace all of the animations.

MyAvatar.getAnimationRoles();
MyAvatar.overrideRoleAnimation(role, url, fps, loop, firstFrame, lastFrame)
MyAvatar.restoreRoleAnimation(role)
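
For instance, you might replace just the “idle” role while leaving walking, turning, etc. untouched (again a sketch - the URL and frame range are placeholders):

print(JSON.stringify(MyAvatar.getAnimationRoles())); // e.g. ["idle", "walk", ...]

var IDLE_URL = "https://example.com/animations/custom-idle.fbx"; // hypothetical URL
MyAvatar.overrideRoleAnimation("idle", IDLE_URL, 30, true, 0, 89);

// later, hand the role back to the default animation:
MyAvatar.restoreRoleAnimation("idle");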

The existing functionality of MyAvatar.setJointRotation will be preserved; however, the coordinate space will be different and, I hope, more predictable. Instead of occurring in the pre-rotated frame, joint rotations will be in object space. Hopefully, this will simplify the process of setting joint rotations manually.

There is a WIP GitHub pull request with the upcoming functionality: https://github.com/highfidelity/hifi/pull/6461

Any feedback or concerns?


#2

I would like to know if the PrioVR SDK has been taken into consideration. The videos High Fidelity created in the past used a PrioVR to pre-record animation. I intend to use the PrioVR in real time to drive my avatar. I have only looked at a deprecated (YEI tech) SDK and could use High Fidelity’s confirmation that this hardware will be supported “out of the box” in much the same way the CV1 and the Vive are anticipated to be.

Can you inform us, @hyperlogic, whether or not the PrioVR will be plug and play with High Fidelity? Thanks.

http://www.priovr.com/


#3

We currently do not have out-of-the-box support for any full-body sensor trackers like PrioVR and Perception Neuron. However, the new animation system has much more advanced IK capabilities, which can be used to drive the avatar via sensor data. It is possible to modify the C++ source to integrate the PrioVR SDK and drive the new IK system.


#4

@hyperlogic @howard.stearns @ozan - I’ve been busy with other projects, just got back to looking at this.

With a few changes, this will work OK for walk.js, but we will need access to each avatar’s unique pre-rotations so that they can be applied. As I understand it, when all joint rotations are zeroed out, the avatar should be in a t-pose. Currently, zeroing out all joints results in this:

Image showing all joints with zero rotation applied

So, imagine if I wanted to animate a t-pose in JS - I would need to know all the pre-rotations for that specific avatar and apply them to get the correct t-pose. Currently, I am unable to continue with my animation projects until I either get access to each avatar’s pre-rotation values or JS animation is put back into the pre-rotated frame.

In all honesty, I can’t see a good case for animating in object space, as doing so will not solve the issue of Blender avatars being exported without their pre-rotation values intact.

I hope this is useful,

  • davedub

#5

Yes, you are correct: after these recent changes, if you want to put the character into a t-pose, you would have to know the joint rotations used for the t-pose and set them directly. Setting all joint rotations to zero will no longer put the model into the t-pose.

You should be able to read these orientations in Blender.

My question, though, is why would you want to put the model in a T-pose in the first place? How is this useful?


#6

Also, one comment about your point:

In all honesty, I can’t see a good case for animating in object space, as doing so will not solve the issue of Blender avatars being exported without their pre-rotation values intact.

I disagree; animating in the non-pre-rotated frame (a.k.a. object space) does solve the issue with Blender avatars. Now calls to MyAvatar.getJointRotation() and MyAvatar.setJointRotation() are in a consistent coordinate frame, and any procedural animation done in JavaScript can be written without having to know what the pre-rotations are or (in the case of Blender) whether they are there at all.
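
For example, a read-modify-write round trip like this (a quick sketch) behaves the same on every avatar, with no pre-rotation knowledge required:

var current = MyAvatar.getJointRotation("Head");
var nod = Quat.angleAxis(15, { x: 1, y: 0, z: 0 }); // 15 degrees about the joint's X axis
MyAvatar.setJointRotation("Head", Quat.multiply(current, nod));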

How are the pre-rotations useful to you?


#7

@hyperlogic Thank you for the reply. I’ll try to explain the need for the pre-rotations to be already applied so that zeroing out the joints puts the avatar in a t-pose (aka reference pose, aka bind pose).

As with regular FBX based animation, a reference pose is necessary so that all JS animation has a standard starting point. If there is no t-pose / bind pose / reference pose, the following issues arise:

  1. Because the pre-rotation values vary appreciably between different avatars, getting consistent results without them is impossible. Before the PR where @howard.stearns introduced the t-pose-as-bind-pose requirement, I could never get consistent results across all avatars. Since HiFi adopted the t-pose as the bind / reference pose, JS animation has been consistent on all avatars that comply with the avatar standards.

  2. Whilst it is possible to manually read the pre-rotation values from Maya, I have done it and it’s a painstaking task: with 60+ joints and 3 values to read from each joint, there are over 180 values to manually copy for each and every avatar.
    Moreover, if pre-rotation values were not available to JS, I’d have to manually create a database of pre-rotation values for every single avatar to get the previous, consistent results.

  3. Even if the t-pose pre-rotation values were accessible, I would still have to apply them every frame in JS before applying the actual animation values. Given that JS code execution is generally more computationally expensive than C++, it makes more sense to me to apply them under the hood in C++, as I cannot imagine a situation where using the pose pictured in my previous reply as the starting point for any JS animation task would be advantageous.
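
To illustrate point 3, here is a minimal sketch of the per-frame work this implies - preRotations is a hand-tabulated table for one specific avatar, and computeAnimationRotation() is a hypothetical stand-in for my animation code:

// hand-copied pre-rotation values (hypothetical numbers)
var preRotations = {
    Head: Quat.fromPitchYawRollDegrees(0.0, 0.0, 0.0),
    LeftArm: Quat.fromPitchYawRollDegrees(0.0, 0.0, -45.0)
    // ...and so on, for 60+ joints
};

Script.update.connect(function (deltaTime) {
    for (var jointName in preRotations) {
        var animRotation = computeAnimationRotation(jointName); // the actual animation data
        // the pre-rotation must be re-applied before the animation value, every frame:
        MyAvatar.setJointRotation(jointName, Quat.multiply(preRotations[jointName], animRotation));
    }
});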

In summary:

  • Without pre-rotations being already applied, I do not have a reference pose / t-pose / bind pose to work from, so consistent results across different avatars are not possible.

  • Without access to the pre-rotation values for each unique avatar, I cannot put the avatar into a reference pose before I start applying animation data.

  • If the pre-rotation values were available to JS, there would be an extra set of calculations to make for every axis on every joint for every frame.

  • Pre-rotation values form part of each avatar’s individual specification. Ideally, animation files (FBX or JS) would work nicely for all avatars. If we are aiming for the situation where all JS animations fit all avatars, specific pre-rotations need to be applied before any general animation data is applied (as is the case with FBX animation).

I hope I’ve explained this ok!

  • davedub

#8

That out-of-band approach is probably the most reliable and painless way to go here…

And having access to such metadata as a “source of truth” provides for a virtual bulkhead – helping avoid all abstraction layers becoming flooded with doubt whenever problems emerge.

I used to generate a similar JSON representation with an improvised command-line tool – using HiFi’s FBXReader.cpp to read a model into memory and then immediately dump its structure.
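
Roughly speaking, each joint entry in such a dump might look like this (field names here are illustrative, not the tool’s exact output):

{
    "Hips": {
        "preRotation": { "x": 0, "y": 0, "z": 0, "w": 1 },
        "localRotation": { "x": 0, "y": 0, "z": 0, "w": 1 },
        "translation": { "x": 0, "y": 98.4, "z": 1.2 }
    }
}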

I’d be curious to compare whether the values match your manual tabulations in Maya – if that’s of interest then here is a non-JSON variation with quats and mat4’s pretty-printed for easier study.

And on a side note – “pre-rotation” seems to be an emergent property: in actual FBX data there is Lcl Rotation (local rotation), but PreRotation always appears to be zero. The C++ code does have a pickup for FBX PreRotation – I’ve just never seen an .fbx with values for it (but that might just be because Blender’s exporter literally hard-codes it to zero).


#9

Hi @humbletim - thank you for the reply.

I can’t really agree with you here, as I think the most reliable, painless way forward is to use pre-rotations (aka joint orientations in Maya) to get avatars into their bind / reference pose as intended. It’s a great system: it is the standard way in which all regular FBX animation is done, and it used to work extremely well for JS animation.

The root of these issues

All of these issues stem from the fact that Blender fails to export pre-rotations. As you pointed out, they are always zero: if Blender finds any when importing an avatar, it (wrongly) adds them to any local rotation and exports the result in the Lcl Rotation field. To compound this, from reading @Menithal's posts, the pre-rotation values seem difficult to interpret - e.g. simply making sure there is no local rotation, then manually swapping the Lcl Rotation and PreRotation fields in the exported FBX, does not give good results. AFAIK, nobody working on HiFi has solved this yet. I believe the best (and only correct) solution would be a modified FBX exporter plugin for Blender that does honour pre-rotation values.

Hacky solutions

Simply taking pre-rotations out of the equation for JS animation may have fixed the inconsistency between getJointRotation and setJointRotation, but it has made consistent, high-fidelity JS animation across all avatars virtually impossible.

Correct solution

Using pre-rotation values is the correct, standard way to provide a reference pose for all animation. This is how FBX animation works, and it is how JS animation used to work very well.

In summary

Simply ignoring pre-rotations because Blender does not (yet) honour them is a hack, not a fix! I am quite certain we are heading in the wrong direction here.

#10

@davedub Thanks for the clarification and expressing your concerns.

There are lots of things to discuss here. It’s clear that exposing a reference pose to JS is desirable; however, we’ve run into some issues with this. We’d like this reference pose to be something that is consistent across authoring packages, but Blender doesn’t support pre-rotations (a.k.a. “Orient Joint”) internally at all. So it’s a larger issue than just making modifications to the FBX exporter.

We could use a bind pose as a reference; however, we’d like to allow content authors to use different bind poses for each mesh piece. So, what we’re left with is the default pose, i.e. the orientation of all the joints when you display the model without animation.

We’re using this default pose internally to measure joint lengths and other skeleton information for eye-tracking, head-tracking and IK, so we can expose it to JavaScript, perhaps something like this:

MyAvatar.getDefaultRotation(i)
MyAvatar.getDefaultTranslation(i)

You can use this default pose in JavaScript to compose a full rotation by combining it with your existing animation data.
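
A minimal sketch, using the proposed method names above (animRotation stands in for your own animation data):

var jointIndex = MyAvatar.getJointIndex("LeftArm");
var animRotation = Quat.fromPitchYawRollDegrees(0, 0, 20); // your animation data
var fullRotation = Quat.multiply(MyAvatar.getDefaultRotation(jointIndex), animRotation);
MyAvatar.setJointRotation(jointIndex, fullRotation);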

It is understood that it will be slower to do these computations in JS instead of C++. If this becomes a common use case, we can expose new methods that take default-pose-relative transformations.

Thoughts?


#11

Hi @hyperlogic,

Thank you for letting my voice be heard on this - it is very much appreciated.

Internal default pose

Regarding the ‘internal default pose’ you mentioned, can I confirm that it is a t-pose, constructed from the specific pre-rotations of each unique avatar? If so, exposing these values to JS would work fine, and I’d have walk.js (and dance.js, currently under development) working again within an hour or two :slight_smile:

Pre-rotations in Blender

In light of this, I’ll step back from the idea of finding a way to make Blender honour pre-rotations and think on another solution. When you say the reference pose should be consistent across authoring packages - how strict is this requirement? I only ask because, if the avatar creation / import process could deviate just a little for Blender, the solution I propose below might work for everybody.

Possible 'catch all' solution

When importing a Maya / standard avatar, am I right to say that the specified pre-rotations (joint orients) are used to create the internal default pose unique to that avatar? If so, could there not be functionality on avatar import that basically says ‘if the pre-rotation values specified for the joints are all zero, create the internal default pose based on the local rotations instead’ for Blender avatars?
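
In JavaScript-flavoured pseudocode (isZero() and the joint fields here are hypothetical, just to show the idea):

// run once per joint when the FBX is imported:
function defaultPoseRotation(joint) {
    if (!isZero(joint.preRotation)) {
        return joint.preRotation;   // Maya-style avatar: use the joint orients
    }
    return joint.localRotation;     // Blender avatar: fall back to the local rotation
}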

In this way, avatars could be exported from Blender with their local rotations describing a t-pose, and we’d have circumvented Blender’s shortcomings w.r.t. pre-rotations.

Different bind poses

I’m not quite clear on why you’d want to allow content authors to use different bind poses for each mesh piece. Out of curiosity, how / when would different bind poses for an avatar be useful?

Cheers!

  • davedub

#12

Actually, I was thinking more about the pain expressed by various alphas in the context of rewrites/rework across the last six months - which to me indicates a root-cause problem that continues to go unnoticed and unsolved.

I’ve examined the asset pipeline from Blender into Interface in excruciating detail - including the source code on all sides. The rotation situation is not as simple as it currently seems to be assumed.

If nothing else, this matters to the future. For example, it seems highly improbable that “avatar recordings” will be capable of surviving any better than simple avatar animations - which already do not.

But to me the most unfortunate aspect of this situation is watching as native works of art are essentially and unnecessarily turned to dust. Even if unintentional – that’s still a shame. And I was hoping our grandchildren would be able to travel back in VR to the “now” and get an actual feel for what the early metaverse was like. I think there are ways to approach the problem much more systematically and with much less waste.


#13

Hi @humbletim,

I too am starting to feel my project is turning to dust. It’s a huge blow - I have already started a second project (dance.js) that uses my harmonic animation system to get avatars dancing in time to music. I have an Emotiv Insight in the mail right now that I intend to use to make gait adjustments to walking and standing animations that reflect the user’s emotional state. I have a PR for walk.js that uses full-body harmonic animation and has been ready for release for months, but S3 woes and now this issue mean that possibly the best work I have ever done may never be released.

All of that said, I still believe HiFi to be the future of VR, and will keep trying to contribute. But if we can’t find a solution to this pre-rotation issue then everything I’ve made over the last year+ will be for nothing.

I’m guessing that I’m missing some fundamental detail in my Possible ‘catch all’ solution above. I’d very much appreciate your input on what needs to happen.

  • davedub

#14

You’re not alone on that, @humbletim - with plenty of programmers, artists, and like-minded VR enthusiasts here, let’s get more eyes on the problem.

I’m certainly not suggesting you can’t handle it; just that maybe taking the thread to the “next level” could help: code examples, explanations - write for your target audience. I promise to drop out once the context goes over my head. :smiley:

I for one intend to use a PrioVR (hitting production manufacturing soon, WOOT!) to drive my avatars, so one would think I have no vested interest in this thread. But considering that most (if not all) of my avatars will be constructed using Blender, this “situation”, as I call it, has my utmost attention.

I’m not suggesting we pass blame here or there - to Blender, to HiFi… just, ya know, fill us in: what’s going on? Maybe we (all of us) can help.


#15

Put simply: Avatar rotations have become chaotic.

And by chaotic I mean rotations exhibit unpredictable behavior that can even appear random, and often suffer great sensitivity to small changes in conditions.

This kind of chaos is, to me, a slightly different problem from, say, the kind that rigid body simulation or human controller inputs introduce.

Below is a partial list of troubleshooting questions. Many permutations are possible, and different causes tend to exhibit the same or similar symptoms, making effective diagnosis from something like a screenshot impractical. (Note: after the list I also propose a potential solution path.)


  • FBX:
      • Which tool was used to export?
      • Which version of that tool was used?
      • Which export settings were used?
      • Which scene objects were included?
      • When was the import done? (Which HiFi build/version?)
      • Which C++ code path did the import travel across? (i.e. did it take any detours, like the Mixamo workaround?)
      • Are you 100% sure you selected the right FBX export and were looking at the corresponding FBX import? (Or: how do you know you weren’t observing a previously-cached version?)
  • FST:
      • How are joints mapped in your .fst file?
      • Are any translations, rotations or scales applied in the .fst?
      • Are you 100% sure your .fst doesn’t result in an old, cached avatar model being used?
  • Avatar:
      • Are animations involved? (How/where/when are they applied?)
      • Are shape keys involved?
      • What kind of mesh structures are involved? (And how do they compare with reference models?)
      • What overall mesh proportions are involved?
      • What overall armature proportions are involved?
      • How have bone rolls been configured?
      • What camera angles and projection modes were used to perform visual inspection?
      • Are vertex weights affecting any perceived joint rotations?
  • Blender:
      • Is your armature object rotated, translated or scaled?
      • Are pose bones rotated, translated or scaled? (And how does that compare to the “default pose”?)
      • Are any armature bones “disconnected” or otherwise translated from their parents?
      • Are any armature-related objects nested in Empties or other virtual containers?
  • Maya:
      • (Note: I don’t have Maya, but imagine a similar set of questions to Blender here…)
  • Interface:
      • Was reconstruction of the armature concept from FBX bone chains 100% successful and accurate? (i.e. how does the reconstructed armature compare numerically to the original armature?)
      • Are IK animations enabled (or otherwise affecting results)?
      • How about physics / collisions?
      • Or other animation / blending?
      • Or scripted avatar joints?
      • Or hardware devices?
      • Or other settings that affect rotation results?

Collecting test cases doesn’t seem to have been very effective in the past; however, it still seems like the best way we alphas could invest time in the near term. I suggest combining such an effort with a TBD methodology (to help capture effective test cases) and then literally wiring the samples into an automated regression testing system.

Such testing could be undertaken entirely by alphas for now, saving HiFi the initial trouble. Later maybe the build bot could be harnessed – that way, within minutes of PR merges, anybody who cared could know when avatar rotations regressed (and in proximity to likely code causes).


#16

Edit: I was supposed to post this days ago, but the server was down then and I forgot about it; I’ve now looked at the thread again and am posting it here:

Aye, what @hyperlogic mentioned is true:

Blender uses quaternions by default, AND the bone system works differently: in Blender, bones are defined with a vector and a roll instead of rotation and scale (at least that is how I remember it working in Maya 7). There is also no FBX documentation on how exactly the pre-rotations are calculated.

However, Blender does have the capability to import models and animations both with and without pre-rotations, without… perceived issue (if I am wrong, do correct me!). So why not check if we can bring that same functionality into High Fidelity’s model reader instead?

@humbletim Tests for the second part should be simple enough to catch the biggest breakers: blank environment, apply tests using a script that runs through various poses, and compare to a similar, separately exported model (something that has been confirmed to work before).
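
Something along these lines (a sketch - referencePose would be rotations recorded from a model that is known to be good):

var referencePose = { Head: { x: 0, y: 0, z: 0, w: 1 } }; // recorded reference data
var EPSILON = 0.001;

for (var jointName in referencePose) {
    var actual = MyAvatar.getJointRotation(jointName);
    var expected = referencePose[jointName];
    var ok = Math.abs(actual.x - expected.x) < EPSILON &&
             Math.abs(actual.y - expected.y) < EPSILON &&
             Math.abs(actual.z - expected.z) < EPSILON &&
             Math.abs(actual.w - expected.w) < EPSILON;
    print(jointName + (ok ? " OK" : " REGRESSED"));
}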

The hardware devices, after all, interface with the JavaScript, while the JavaScript interfaces with the animation system.


#17

By convention, the default pose of our avatars is the SAME as the pre-rotated frame. So, when we add a method to access this default pose, you can interpret it as the pre-rotated frame and apply a delta. You won’t have to write down values from within Maya.

All our standard avatars adhere to the same joint orientations, so any procedural animations you create should play back more or less correctly (except for the obvious differences in bone length and model proportions). Likewise, any avatars created by users will play correctly with our standard animations.

Our hope is to minimize breakage but at the same time open up support for the widest set of authoring tools. In this case, our animation system was dependent on an FBX feature that is only supported by Maya, a $3,000 authoring tool. This change unfortunately introduced breakage in the short term, but I feel it’s the best thing for the community going forward.

Once the default pose is accessible to you, it should be straightforward to migrate your code. If you can share your code, I can give you advice on how to proceed.


#18

Here’s a PR which exposes the default pose to JavaScript.


#19

Could you please provide the .blend file for one or more of the avatars you believe work correctly?

It would be a useful test for everyone involved if it were possible to open the .blend file you provide, export it as FBX, import it into Interface and see everything “just working.”


#20

I really need to check out the PR build listed above before we can make any more progress with this, but I am unable to install it (details here: https://github.com/highfidelity/hifi/pull/6615)

The tests I want to run

What I’m not clear on is whether the default pose exposed to JavaScript in the PR build is a standard, ‘catch all’ pose or is unique to each avatar. If it is not unique, it may work OK, but it will most likely not give optimal results. If sub-optimal, it will be interesting to see how sub-optimal - it could be something I can work with, at least for now, as walkTools already provides the ability to add and adjust an extra set of pre-rotations interactively.

However, if the default pose is correct (i.e. based on each individual avatar’s exact rotations for a clean t-pose) then the problem will be solved :smile: But given the issue of Blender’s missing pre-rotation values, this seems very unlikely.

PR build issue

I’ve installed PR builds before, but this one just won’t play nicely - when I run it, explorer.exe hangs.

I’ve tried the following:

  1. Run current 64-bit release installer - installs no problem
  2. Run 6615 PR build installer (both 32-bit & 64-bit versions) - hangs explorer
  3. Run 6615 PR build installer (both 32-bit & 64-bit versions) as administrator - hangs explorer

Here’s an image showing hung installer processes in task manager:

I note both the 32-bit and 64-bit installers are only 56MB - over 100MB smaller than the release installer. Am I missing something - do I need to have a C++ dev environment set up to run these PR build installers? (I have successfully installed PR builds before, but not since August)

Has anyone had experience with this issue?

  • davedub